This repository has been archived by the owner on Jun 1, 2021. It is now read-only.

Merge branch 'master' into r-0.8
Conflicts:
	build.sbt
krasserm committed Nov 10, 2016
2 parents 9a0a10d + d6f7bee commit 0200620
Showing 177 changed files with 8,240 additions and 514 deletions.
3 changes: 3 additions & 0 deletions .travis.yml
@@ -14,10 +14,13 @@ before_script:
- pip install --user sphinx
- pip install --user sphinx_rtd_theme
script:
- .travis/compile-all.sh ++$TRAVIS_SCALA_VERSION
- .travis/test-core.sh ++$TRAVIS_SCALA_VERSION
- .travis/test-leveldb.sh ++$TRAVIS_SCALA_VERSION
- .travis/test-crdt.sh ++$TRAVIS_SCALA_VERSION
- .travis/test-stream.sh ++$TRAVIS_SCALA_VERSION
- .travis/test-spark.sh ++$TRAVIS_SCALA_VERSION
- .travis/test-vertx.sh ++$TRAVIS_SCALA_VERSION
- find $HOME/.sbt -name "*.lock" | xargs rm
- find $HOME/.ivy2 -name "ivydata-*.properties" | xargs rm
after_success:
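The cache-cleanup step above pipes `find` into `xargs rm`, which errors out when no files match (GNU `xargs` then runs `rm` with no operands). A more defensive sketch of the same cleanup, using `find -exec ... +` instead; `CACHE_ROOT` is a stand-in for `$HOME` in the real CI environment:

```shell
#!/bin/sh
# Sketch: the same sbt/ivy cache cleanup as in .travis.yml, but with
# find -exec ... +, which simply does nothing when no files match.
# CACHE_ROOT is a hypothetical override for testing; it defaults to $HOME.
CACHE_ROOT="${CACHE_ROOT:-$HOME}"
find "$CACHE_ROOT/.sbt"  -name "*.lock"               -exec rm -f {} + 2>/dev/null
find "$CACHE_ROOT/.ivy2" -name "ivydata-*.properties" -exec rm -f {} + 2>/dev/null
```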
3 changes: 3 additions & 0 deletions .travis/compile-all.sh
@@ -0,0 +1,3 @@
#!/bin/bash

sbt $1 test:compile
3 changes: 3 additions & 0 deletions .travis/test-stream.sh
@@ -0,0 +1,3 @@
#!/bin/sh

.travis/test-template.sh $1 adapterStream
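`test-stream.sh` delegates to `.travis/test-template.sh`, which is not shown in this diff. Presumably it receives the Scala version switch as `$1` and the sbt project name as `$2`; a hypothetical stand-in for that delegation pattern, which only prints the command it would run:

```shell
#!/bin/sh
# Sketch of the delegation used by test-stream.sh. run_project_tests is a
# hypothetical stand-in for test-template.sh: it takes the Scala version
# switch and the sbt project name, and prints the sbt invocation instead
# of executing it.
run_project_tests() {
  version_switch="$1"
  project="$2"
  echo "sbt $version_switch $project/it:test"
}

run_project_tests "++2.11.8" adapterStream
# prints: sbt ++2.11.8 adapterStream/it:test
```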
3 changes: 3 additions & 0 deletions .travis/test-vertx.sh
@@ -0,0 +1,3 @@
#!/bin/sh

sbt $1 "adapterVertx/it:test"
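The `.travis.yml` change above wires these scripts into the build in sequence. A sketch of replaying that sequence locally; the loop only prints each invocation rather than executing it, and the Scala version is an assumption matching `build.sbt`:

```shell
#!/bin/sh
# Sketch: the CI script sequence from .travis.yml, replayed locally.
# Assumes the .travis/ helper scripts exist in the working copy.
SCALA_VERSION="2.11.8"   # assumption: the value TRAVIS_SCALA_VERSION takes in CI
for step in compile-all test-core test-leveldb test-crdt test-stream test-spark test-vertx; do
  echo ".travis/$step.sh ++$SCALA_VERSION"
done
```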
16 changes: 8 additions & 8 deletions README.md
@@ -5,16 +5,16 @@
Eventuate
=========

Eventuate is a toolkit for building applications composed of event-driven and event-sourced services that collaborate by exchanging events over shared event logs. Services can either be co-located on a single node or distributed up to global scale. Services can also be replicated with causal consistency and remain available for writes during network partitions. Eventuate has a [Java](http://www.oracle.com/technetwork/java/javase/overview/index.html) and [Scala](http://www.scala-lang.org/) API, is written in Scala and built on top of [Akka](http://akka.io), a toolkit for building highly concurrent, distributed, and resilient message-driven applications on the JVM. Eventuate
Eventuate is a toolkit for building applications composed of event-driven and event-sourced services that communicate via causally ordered event streams. Services can either be co-located on a single node or distributed up to global scale. Services can also be replicated with causal consistency and remain available for writes during network partitions. Eventuate has a [Java](http://www.oracle.com/technetwork/java/javase/overview/index.html) and [Scala](http://www.scala-lang.org/) API, is written in Scala and built on top of [Akka](http://akka.io), a toolkit for building highly concurrent, distributed, and resilient message-driven applications on the JVM. Eventuate

- provides event-sourcing abstractions for building stateful services on the command-side and query-side of CQRS-based applications
- offers services a reliable and partition-tolerant event storage and event-based communication infrastructure that preserves causal ordering
- supports the development of *always-on* applications by allowing services to be distributed across multiple availability zones (locations)
- supports stateful service replication with causal consistency and concurrent state updates with automated and interactive conflict resolution options
- supports the implementation of reliable business processes from collaborating services that are tolerant to inter-service network partitions
- supports the aggregation of events from distributed services for updating persistent and in-memory query databases
- provides abstractions for building stateful event-sourced services, persistent and in-memory query databases and event processing pipelines
- enables services to communicate over a reliable and partition-tolerant event bus with causal event ordering and distribution up to global scale
- supports stateful service replication with causal consistency and concurrent state updates with automated and interactive conflict resolution
- provides implementations of operation-based CRDTs as specified in [A comprehensive study of Convergent and Commutative Replicated Data Types](http://hal.upmc.fr/file/index/docid/555588/filename/techreport.pdf)
- provides adapters to 3rd-party stream processing frameworks for analyzing generated events
- supports the development of *always-on* applications by allowing services to be distributed across multiple availability zones (locations)
- supports the implementation of reliable business processes from event-driven and command-driven service interactions
- supports the aggregation of events from distributed services for updating query databases
- provides adapters to 3rd-party stream processing frameworks for analyzing event streams

Documentation
-------------
145 changes: 144 additions & 1 deletion build.sbt
@@ -1 +1,144 @@
(deleted: the previous build.sbt, collapsed into a single line by the diff view — it defined version 0.8-M2, organization com.rbmhtechnology, Scala 2.11.7, and the root, core, logCassandra, logLeveldb, adapterSpark, crdt, examples and exampleSpark projects; the replacement multi-line build.sbt follows)
import sbt._
import sbt.Keys._
import sbtunidoc.Plugin.UnidocKeys._

import MultiJvmKeys._

import ProjectSettings._
import ProjectDependencies._

version in ThisBuild := "0.9-SNAPSHOT"

organization in ThisBuild := "com.rbmhtechnology"

scalaVersion in ThisBuild := "2.11.8"

lazy val root = (project in file("."))
.aggregate(core, crdt, logCassandra, logLeveldb, adapterSpark, adapterStream, adapterVertx, examples, exampleStream, exampleSpark, exampleVertx)
.dependsOn(core, logCassandra, logLeveldb)
.settings(name := "eventuate")
.settings(rootSettings: _*)
.settings(documentationSettings: _*)
.settings(unidocProjectFilter in (ScalaUnidoc, unidoc) := inAnyProject -- inProjects(examples, exampleStream, exampleSpark, exampleVertx))
.settings(libraryDependencies ++= Seq(AkkaRemote))
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)
.disablePlugins(SbtScalariform)

lazy val core = (project in file("eventuate-core"))
.settings(name := "eventuate-core")
.settings(commonSettings: _*)
.settings(protocSettings: _*)
.settings(integrationTestSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote, CommonsIo, Java8Compat, Scalaz))
.settings(libraryDependencies ++= Seq(AkkaTestkit % "test,it", AkkaTestkitMultiNode % "test", Javaslang % "test", JunitInterface % "test", Scalatest % "test,it"))
.settings(integrationTestPublishSettings: _*)
.configs(IntegrationTest, MultiJvm)
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val logCassandra = (project in file("eventuate-log-cassandra"))
.dependsOn(core % "compile->compile;it->it;multi-jvm->multi-jvm")
.settings(name := "eventuate-log-cassandra")
.settings(commonSettings: _*)
.settings(integrationTestSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote, CassandraDriver))
.settings(libraryDependencies ++= Seq(AkkaTestkit % "test,it", AkkaTestkitMultiNode % "test", Log4jApi % "test,it", Log4jCore % "test,it", Log4jSlf4j % "test,it", Scalatest % "test,it", Sigar % "test,it"))
.settings(libraryDependencies ++= Seq(CassandraUnit % "test,it" excludeAll ExclusionRule(organization = "ch.qos.logback")))
.settings(jvmOptions in MultiJvm += "-Dmultinode.server-port=4712")
.configs(IntegrationTest, MultiJvm)
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val logLeveldb = (project in file("eventuate-log-leveldb"))
.dependsOn(core % "compile->compile;it->it;multi-jvm->multi-jvm")
.settings(name := "eventuate-log-leveldb")
.settings(commonSettings: _*)
.settings(integrationTestSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote, Leveldb))
.settings(libraryDependencies ++= Seq(AkkaTestkit % "test,it", AkkaTestkitMultiNode % "test", Scalatest % "test,it"))
.settings(jvmOptions in MultiJvm += "-Dmultinode.server-port=4713")
.configs(IntegrationTest, MultiJvm)
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val adapterSpark = (project in file("eventuate-adapter-spark"))
.dependsOn(logCassandra % "compile->compile;it->it;multi-jvm->multi-jvm")
.dependsOn(logLeveldb % "compile->compile;it->it;multi-jvm->multi-jvm")
.settings(name := "eventuate-adapter-spark")
.settings(commonSettings: _*)
.settings(integrationTestSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote, CassandraClientUtil, CassandraConnector,
SparkCore % "provided" exclude("org.slf4j", "slf4j-log4j12"),
SparkSql % "provided" exclude("org.slf4j", "slf4j-log4j12"),
SparkStreaming % "provided" exclude("org.slf4j", "slf4j-log4j12")))
.settings(libraryDependencies ++= Seq(AkkaTestkit % "test,it", AkkaTestkitMultiNode % "test", Scalatest % "test,it", Sigar % "test,it"))
.settings(libraryDependencies ++= Seq(CassandraUnit % "test,it" excludeAll ExclusionRule(organization = "ch.qos.logback")))
.settings(jvmOptions in MultiJvm += "-Dmultinode.server-port=4714")
.configs(IntegrationTest, MultiJvm)
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val adapterStream = (project in file("eventuate-adapter-stream"))
.dependsOn(core % "compile->compile;it->it")
.dependsOn(logLeveldb % "it->it")
.settings(name := "eventuate-adapter-stream")
.settings(commonSettings: _*)
.settings(integrationTestSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaStream))
.settings(libraryDependencies ++= Seq(AkkaTestkit % "test,it", AkkaStreamTestkit % "test,it", Scalatest % "test,it"))
.configs(IntegrationTest)
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val adapterVertx = (project in file("eventuate-adapter-vertx"))
.dependsOn(core % "compile->compile;it->it")
.dependsOn(logLeveldb % "it->it")
.settings(name := "eventuate-adapter-vertx")
.settings(commonSettings: _*)
.settings(integrationTestSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote,
VertxCore % "provided",
VertxRxJava % "provided"))
.settings(libraryDependencies ++= Seq(AkkaTestkit % "test,it", Scalatest % "test,it"))
.configs(IntegrationTest)
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val crdt = (project in file("eventuate-crdt"))
.dependsOn(core % "compile->compile;it->it;multi-jvm->multi-jvm")
.dependsOn(logLeveldb % "test;it->it;multi-jvm->multi-jvm")
.settings(name := "eventuate-crdt")
.settings(commonSettings: _*)
.settings(protocSettings: _*)
.settings(integrationTestSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote))
.settings(libraryDependencies ++= Seq(AkkaTestkit % "test,it", AkkaTestkitMultiNode % "test", Scalatest % "test,it"))
.settings(jvmOptions in MultiJvm += "-Dmultinode.server-port=4715")
.configs(IntegrationTest, MultiJvm)
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val examples = (project in file("eventuate-examples"))
.dependsOn(core, logLeveldb)
.settings(name := "eventuate-examples")
.settings(commonSettings: _*)
.settings(exampleSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote, CassandraDriver, Javaslang, Log4jApi, Log4jCore, Log4jSlf4j))
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val exampleStream = (project in file("eventuate-example-stream"))
.dependsOn(core, logLeveldb, adapterStream)
.settings(name := "eventuate-example-stream")
.settings(commonSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote, Log4jApi, Log4jCore, Log4jSlf4j))
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val exampleSpark = (project in file("eventuate-example-spark"))
.dependsOn(core, logCassandra, adapterSpark)
.settings(name := "eventuate-example-spark")
.settings(commonSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote, CassandraDriver, Log4jApi, Log4jCore, Log4jSlf4j,
SparkCore exclude("org.slf4j", "slf4j-log4j12"),
SparkSql exclude("org.slf4j", "slf4j-log4j12"),
SparkStreaming exclude("org.slf4j", "slf4j-log4j12")))
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)

lazy val exampleVertx = (project in file("eventuate-example-vertx"))
.dependsOn(core, logLeveldb, adapterVertx)
.settings(name := "eventuate-example-vertx")
.settings(commonSettings: _*)
.settings(libraryDependencies ++= Seq(AkkaRemote, Leveldb, Javaslang, Log4jApi, Log4jCore, Log4jSlf4j, ExampleVertxCore, ExampleVertxRxJava))
.enablePlugins(HeaderPlugin, AutomateHeaderPlugin)
@@ -65,12 +65,12 @@ class SparkBatchAdapterSpec extends TestKit(ActorSystem("test")) with WordSpecLi
"read events from a local event log" in {
val writtenEvents = writeEvents("a")
val readEvents = sparkAdapter.eventBatch(logId)
readEvents.sortBy(_.localSequenceNr).collect() should be (writtenEvents)
readEvents.sortBy(_.localSequenceNr).collect() should be(writtenEvents)
}
"read events from a local event log starting from a given sequence number" in {
val writtenEvents = writeEvents("a")
val readEvents = sparkAdapter.eventBatch(logId, 23L)
readEvents.sortBy(_.localSequenceNr).collect() should be (writtenEvents.drop(22))
readEvents.sortBy(_.localSequenceNr).collect() should be(writtenEvents.drop(22))
}
}
}
@@ -86,9 +86,9 @@ class SparkStreamAdapterSpec extends WordSpec with Matchers with MultiLocationSp
val writer = new EventLogWriter("writer", log)
implicit val dispatcher = system.dispatcher
def go(i: Int): Future[List[DurableEvent]] = for {
_ <- pattern.after(interval, system.scheduler)(Future.successful(Nil))
_ <- pattern.after(interval, system.scheduler)(Future.successful(Nil))
es1 <- writer.write(Seq(s"$prefix-$i"))
es2 <- if (i < num) go(i+1) else Future.successful(Nil)
es2 <- if (i < num) go(i + 1) else Future.successful(Nil)
} yield es1.head :: es2
go(0)
}
