This repository contains the code samples used during the presentation given by Travis Spencer to the Stockholm Java user group meeting on December 11, 2014.
The simple examples (1-7) each live in a separate Java file. To test them, do the following:
- Build the code (`mvn package`).
- Execute the example (e.g., `java java_meetup_spark_demo.Example01_Hello_World`); this will start the Web API.
- Run the corresponding script (e.g., `scripts/Example01_Hello_World`), which will curl a request to the API.
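Put together, a full run of the first example looks like this (the API-starting command blocks, so run it in a separate terminal):

```shell
# Build all of the examples.
mvn package

# Start the Web API for example 1 (blocks; run in a separate terminal).
java java_meetup_spark_demo.Example01_Hello_World

# Fire a test request at the running API.
scripts/Example01_Hello_World
```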
All of these simple examples are based on the ones distributed with Spark.
In the Example08 source directory, you will find an example that adds some useful functionality on top of Spark:
- Pico container integration, including hierarchical dependency resolution
- Simplified programming model for creating controllers
- Syntactic sugar to beautify Spark's syntax for defining routes
The Pico integration is handled in the `Application` class. On application startup, session creation, and request initiation, the `ContainerComposer`'s respective static methods are called to wire up any app-, session-, and request-level dependencies. These are passed on to the controllers and other objects.
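The key idea behind the hierarchical resolution is that a child container (e.g., request scope) resolves its own components first and falls back to its parent (e.g., app scope). The sketch below is a simplified stand-in for that lookup pattern, not the actual PicoContainer API or the `ContainerComposer` in the sources:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for a parent/child container: a child resolves
// locally first, then falls back to its parent. This is the shape of the
// app -> session -> request wiring, not the real Pico API.
public class ContainerSketch {
    static class Container {
        private final Container parent;
        private final Map<Class<?>, Object> components = new HashMap<>();

        Container(Container parent) { this.parent = parent; }

        void addComponent(Class<?> key, Object instance) {
            components.put(key, instance);
        }

        Object getComponent(Class<?> key) {
            Object found = components.get(key);
            if (found == null && parent != null) {
                return parent.getComponent(key); // hierarchical fallback
            }
            return found;
        }
    }

    public static void main(String[] args) {
        Container app = new Container(null);         // application scope
        app.addComponent(String.class, "app-level config");

        Container request = new Container(app);      // request scope, child of app
        request.addComponent(Integer.class, 42);

        // A request-level lookup finds its own component and falls back
        // to the app level for anything it does not define itself.
        System.out.println(request.getComponent(Integer.class));
        System.out.println(request.getComponent(String.class));
    }
}
```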
A controller is a class that implements the `Controllable` interface. This interface defines all the methods that a Web API might want to respond to (e.g., `get`, `post`, etc.). The interface uses Java 8's new `default` keyword on all of these methods, effectively making it more like an abstract class than an interface. Consequently, a controller only needs to override the methods it wishes to handle; the others will not be routed. For each method that is overridden, the `Router` (a derivative of Spark's router) will reflectively wire up the override to the given endpoint.
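The default-method pattern and the reflective "is it overridden?" check can be sketched in plain Java. The stub `Request`/`Response` types and the `overrides` helper below are illustrative stand-ins, not the actual Example08 classes:

```java
import java.lang.reflect.Method;

public class ControllableSketch {
    // Stub types standing in for Spark's Request and Response.
    static class Request {}
    static class Response {}

    // Default methods mean implementors only override the verbs they handle.
    interface Controllable {
        default boolean get(Request req, Response res) { return false; }
        default boolean post(Request req, Response res) { return false; }
    }

    static class FooController implements Controllable {
        @Override
        public boolean get(Request req, Response res) { return true; }
        // post() is not overridden, so it should not be routed.
    }

    // Reflectively check whether a controller overrides a verb, mirroring
    // the kind of check a router would make before wiring a route.
    static boolean overrides(Class<?> controller, String verb) {
        try {
            Method m = controller.getMethod(verb, Request.class, Response.class);
            return !m.isDefault(); // still a default => inherited, not overridden
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("route get?  " + overrides(FooController.class, "get"));
        System.out.println("route post? " + overrides(FooController.class, "post"));
    }
}
```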
The net effect is that message handling for an endpoint is packaged up in a controller class (e.g., see `FooController` below), and the routes are kept together in the microservice's `Application` class.
```java
public class Application implements SparkApplication
{
    private final DefaultPicoContainer appContainer;

    public Application() throws ConfigurationException
    {
        appContainer = new DefaultPicoContainer();

        ContainerComposer.composeApplication(appContainer);
    }

    @Override
    public void init()
    {
        route("/login").to(LoginController.class, usingContainer(appContainer), renderedWith("login.vm"));
        route("/foo").to(FooController.class, usingContainer(appContainer), renderedWith("foo.vm"));

        // Etc.
    }
}
```
```java
public class FooController implements Controllable
{
    @Override
    public boolean before(Request request, Response response)
    {
        if (request.session(false) == null)
        {
            response.redirect("/login");

            return false;
        }

        return true;
    }

    @Override
    public boolean get(Request request, Response response, final Map<String, Object> model)
    {
        Map<String, String> data = new HashMap<>();

        data.put("e1", "e1 value");
        data.put("e2", "e2 value");
        data.put("e3", "e3 value");

        model.put("user", "tspencer");
        model.put("data", data);

        return true;
    }
}
```
For more information, check out the Spark home page and read through the documentation. For questions or info on these examples or the micro-micro framework in Example08, contact Travis Spencer.