
How to create greenfield Microservices with Lagom

Meet Lagom, the opinionated Microservices framework


It seems like everyone is talking, writing, and thinking about Microservices architecture these days. Still, there is more confusion around what it is, and how to deliver on it, than ever. Truth be told, we have been spoiled by the once-believed almighty Monolith, with its single SQL database, for way too long. Today’s applications are deployed to everything from mobile devices to cloud-based clusters running thousands of multi-core processors. Consumers (humans and systems) expect millisecond response times and close to 100% uptime. Traditional Java EE architectures on industry standard middleware simply don’t cut it anymore. We can’t make the horse any faster; we need cars for where we are going.

Instead of operating a pile of features that few people actually need, the ideal is the combination of relevant features with as little overhead as possible. And this is where the recently released Lagom (Swedish for “just right”) places its emphasis.

Why Lagom

Lagom is an opinionated framework for guiding JVM developers towards creating Reactive, Microservice-based systems. Lagom’s design rests on the principles outlined by Jonas Bonér in his recent O’Reilly report, Reactive Microservices Architecture: Design Patterns for Distributed Systems.

Lagom is asynchronous by default. All of its APIs use the non-blocking I/O capabilities of Akka Streams for asynchronous streaming and the JDK8 CompletionStage API for asynchronous computation.
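
To make the CompletionStage style concrete for readers who have not used it, here is a minimal, framework-free Java sketch (the class name AsyncSketch is just for illustration); Lagom service calls return values of this shape instead of blocking for a result:

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionStage;

public class AsyncSketch {

    public static void main(String[] args) {
        // Start the work asynchronously; the calling thread is never blocked.
        CompletionStage<String> greeting =
            CompletableFuture.supplyAsync(() -> "Hello")            // runs on a pool thread
                             .thenApply(name -> name + ", Lagom!"); // non-blocking transform

        // React to the result whenever it arrives, then wait only so this
        // little demo does not exit before printing.
        greeting.thenAccept(System.out::println)
                .toCompletableFuture()
                .join();
    }
}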

Lagom favours distributed persistence patterns, in contrast to the traditional centralized database. Event Sourcing (ES) with Command Query Responsibility Segregation (CQRS) is the default way to persist entities.
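
As a rough illustration of what that looks like in code, here is a sketch of an event-sourced entity in the spirit of Lagom’s Persistence API; the types (GreetingEntity, UseGreeting, GreetingChanged, GreetingState) are made up for the example, and the exact API surface of the MVP may differ from what is shown here:

import akka.Done;
import com.lightbend.lagom.javadsl.persistence.PersistentEntity;

import java.util.Optional;

// Illustrative command, event and state types; real ones would be immutable
// and serializable.
class UseGreeting implements PersistentEntity.ReplyType<Done> {
    final String message;
    UseGreeting(String message) { this.message = message; }
}

class GreetingChanged {
    final String message;
    GreetingChanged(String message) { this.message = message; }
}

class GreetingState {
    final String message;
    GreetingState(String message) { this.message = message; }
}

public class GreetingEntity extends PersistentEntity<UseGreeting, GreetingChanged, GreetingState> {

    @Override
    public Behavior initialBehavior(Optional<GreetingState> snapshot) {
        BehaviorBuilder b = newBehaviorBuilder(snapshot.orElse(new GreetingState("Hello")));

        // A command is validated and turned into a persisted event...
        b.setCommandHandler(UseGreeting.class, (cmd, ctx) ->
            ctx.thenPersist(new GreetingChanged(cmd.message),
                            evt -> ctx.reply(Done.getInstance())));

        // ...and the event is applied to produce the new state.
        b.setEventHandler(GreetingChanged.class, evt -> new GreetingState(evt.message));

        return b.build();
    }
}

The read side (the Q in CQRS) is then built from the same stream of persisted events.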

The development environment is particularly important for developer productivity, both while developing a system and while maintaining it. Lagom’s expressive service interface declarations let developers quickly define interfaces and immediately start implementing them. In a system with many services, developers should not have to spend time updating their own environment just to ensure that services are configured to run correctly.
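
For illustration, a service interface declaration can look roughly like the following; the names (GreetingService, the /api/greeting path, "greetingservice") are invented for this sketch, and the descriptor DSL shown follows the Lagom documentation, so details may differ slightly in the MVP:

import static com.lightbend.lagom.javadsl.api.Service.named;
import static com.lightbend.lagom.javadsl.api.Service.pathCall;

import akka.NotUsed;
import com.lightbend.lagom.javadsl.api.Descriptor;
import com.lightbend.lagom.javadsl.api.Service;
import com.lightbend.lagom.javadsl.api.ServiceCall;

public interface GreetingService extends Service {

    // A strictly typed call: no request payload (NotUsed), a String response.
    ServiceCall<NotUsed, String> greeting(String id);

    @Override
    default Descriptor descriptor() {
        // Maps the call onto a path; clients locate "greetingservice" through
        // the Service Locator instead of a hard-coded host and port.
        return named("greetingservice").withCalls(
            pathCall("/api/greeting/:id", this::greeting)
        );
    }
}

The matching implementation simply returns a ServiceCall whose result is a CompletionStage, keeping the call asynchronous end to end.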

Currently in MVP, Lagom offers Java developers four main features:

  • Service API - this provides a way to declare and implement service interfaces for consumption by clients. For location transparency, clients use stable addresses and discover services through a Service Locator. The Service API supports asynchronous messaging and streaming by default, as well as synchronous request-response calls between services (a client sketch follows this list).

  • Persistence API - this provides event-sourced persistent entities for services that store data, with Command Query Responsibility Segregation (CQRS) read-side support for queries. Lagom manages the distribution of persisted entities across a cluster of nodes, enabling sharding and horizontal scaling, with Apache Cassandra as the default database backend.

  • Hot-reload Development Environment - enables developers to run all services along with the supporting Lagom infrastructure with one command. Borrowing from Play Framework, Lagom hot-reloads code changes to services whenever updates are made.

  • Production Readiness - Services can be deployed as-is directly to production using ConductR (part of the Reactive Platform commercial subscription). In production, Reactive Platform also provides monitoring, deployment orchestration and scaling of Lagom services in a container environment, with no additional infrastructure needed.
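
As mentioned in the Service API item above, here is a hedged sketch of what consuming such a service from another service can look like; it assumes the GreetingService interface sketched earlier and the dependency-injection wiring described in Lagom’s documentation:

import java.util.concurrent.CompletionStage;
import javax.inject.Inject;

// Hypothetical consumer; the GreetingService client is provided by Lagom and
// resolved through the Service Locator rather than a fixed address.
public class GreetingConsumer {

    private final GreetingService greetingService;

    @Inject
    public GreetingConsumer(GreetingService greetingService) {
        this.greetingService = greetingService;
    }

    public CompletionStage<String> greet(String id) {
        // invoke() performs the remote call and returns a CompletionStage,
        // keeping the interaction asynchronous from end to end.
        return greetingService.greeting(id).invoke();
    }
}

Because the client is resolved through the Service Locator, the consuming code never needs to know where the greeting service is actually running.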


Installing and working with Lagom

Lagom consists of an sbt plugin and libraries, and optionally provides Activator project templates that make it easier to get started. The first step is to install JDK 8 and Activator. See Lagom’s Quick Setup Guide for details on how to do this.

A Lagom system is typically made up of a set of sbt builds, each build providing multiple services. The easiest way to get started with Lagom is to explore the Twitter-like example application “Chirper”.

$ activator new my-chirper lagom-java-chirper

Lagom includes a development environment that lets you start all your services by simply typing runAll in the activator console. To run my-chirper from the command line, change into the my-chirper directory, adjust the JVM parameters via the SBT_OPTS environment variable, and run the example:

$ cd my-chirper
$ export SBT_OPTS="-Xmx2G -Xss4M -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled"
$ activator runAll
... (booting up)
[info] Starting embedded Cassandra server
..........
[info] Cassandra server running at 127.0.0.1:4000
[info] Service locator is running at http://localhost:8000
[info] Service gateway is running at http://localhost:9000
[info] Service chirp-impl listening for HTTP on 0:0:0:0:0:0:0:0:23966
[info] Service front-end listening for HTTP on 0:0:0:0:0:0:0:0:28118
[info] Service load-test-impl listening for HTTP on 0:0:0:0:0:0:0:0:21360
[info] Service activity-stream-impl listening for HTTP on 0:0:0:0:0:0:0:0:27462
[info] Service friend-impl listening for HTTP on 0:0:0:0:0:0:0:0:21998
[info] (Services started, use Ctrl+D to stop and go back to the console...)

Verify that you can access Chirper by pointing your browser to the service gateway at http://localhost:9000. Click the orange “sign-up” button to create a user and chirp away.

Now that you can run the example application locally, it is time to visit GitHub to fork and clone the sources and take a look at what is going on under the hood. Follow-up blog posts will introduce the basic concepts in more depth.

Lagom is open source...so get started today!

If you want to learn more about Lagom, make sure to give it a test drive. The Lagom home page features several introductory videos that give you a great head start, plus there is documentation, a mailing list, and Gitter and Twitter profiles to interact with.

