!osg, pro Updated OSGi dependencies and using a BundleDelegatingClassLoader

* Numerous version upgrades to dependencies
* Publish integration test maven dependencies automatically
* Use slf4j logging, with standard OPS4J Pax Logging
* Maven and sbt use different target directories
* Add a custom serializer to the integration test
* Add akka-persistence to the integration test
* Removing reference copying and using a BundleDelegatingClassLoader
* Make akka-actor a proper bundle, and remove duplicate classes and reference.conf files from akka-osgi
* Remove akka-osgi-aries

This is a hackathon team effort by:

* Björn Antonsson
* Endre Sándor Varga
* Roland Kuhn
* Patrik Nordwall
parent ef261e8128
commit e98dd13cd6

42 changed files with 497 additions and 854 deletions
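The headline change, dropping the copied ``akka-actor`` classes and merged ``reference.conf`` files in favour of a class loader that delegates to the application bundle and its wired dependencies, boils down to the lookup sketched below. This is a simplified illustration only; the actual ``BundleDelegatingClassLoader`` added by this commit appears in full further down in the diff::

    import org.osgi.framework.Bundle
    import scala.annotation.tailrec

    object BundleLookupSketch {
      // Try each candidate bundle in turn and give up only when none of them can
      // see the class. The real BundleDelegatingClassLoader below also walks the
      // bundle's transitive package wiring and can fall back to a parent loader.
      @tailrec
      def loadFromBundles(name: String, candidates: List[Bundle]): Class[_] =
        candidates match {
          case Nil => throw new ClassNotFoundException(name)
          case bundle :: rest =>
            val loaded =
              try Some(bundle.loadClass(name))
              catch { case _: ClassNotFoundException => None }
            loaded match {
              case Some(cls) => cls
              case None      => loadFromBundles(name, rest)
            }
        }
    }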
.gitignore (vendored): 1 change
@@ -67,6 +67,7 @@ src_managed
 storage
 tags
 target
+target-sbt
 tm*.lck
 tm*.log
 tm.out
@@ -1,14 +1,6 @@
 Akka in OSGi
 ============

-In an OSGi environment the ``akka-osgi`` bundle replaces ``akka-actor`` artifact. It includes all classes from ``akka-actor`` and merged ``reference.conf`` files from all akka modules. The dependency is::
-
-  <dependency>
-    <groupId>com.typesafe.akka</groupId>
-    <artifactId>akka-osgi_@binVersion@</artifactId>
-    <version>@version@</version>
-  </dependency>
-
 Configuring the OSGi Framework
 ------------------------------

@@ -19,26 +11,20 @@ instead of resolving it through the normal OSGi class space.
 Activator
 ---------

-To bootstrap Akka inside an OSGi environment, you can use the ``akka.osgi.AkkaSystemActivator`` class
+To bootstrap Akka inside an OSGi environment, you can use the ``akka.osgi.ActorSystemActivator`` class
 to conveniently set up the ActorSystem.

 .. includecode:: code/osgi/Activator.scala#Activator

-The ``AkkaSystemActivator`` class is included in the ``akka-osgi`` artifact.
-
-Blueprint
----------
+The ``ActorSystemActivator`` creates the actor system with a class loader that finds resources
+(``reference.conf`` files) and classes from the application bundle and all transitive dependencies.

-For the Apache Aries Blueprint implementation, there's also a namespace handler available. The namespace URI
-is http://akka.io/xmlns/blueprint/v1.0.0 and it can be used to set up an ActorSystem.
-
-.. includecode:: code/osgi/blueprint.xml
-
-The blueprint is included in the ``akka-osgi-aries`` artifact::
+The ``ActorSystemActivator`` class is included in the ``akka-osgi`` artifact.

   <dependency>
     <groupId>com.typesafe.akka</groupId>
-    <artifactId>akka-osgi-aries_@binVersion@</artifactId>
+    <artifactId>akka-osgi_@binVersion@</artifactId>
     <version>@version@</version>
   </dependency>
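The ``.. includecode::`` directive above pulls in ``code/osgi/Activator.scala``, which is not part of this diff. As a rough sketch of what such an activator looks like (``SomeActor`` is a hypothetical placeholder, not a class from this commit), an ``ActorSystemActivator`` subclass implements ``configure`` and starts its actors there::

    import akka.actor.{ Actor, ActorSystem, Props }
    import akka.osgi.ActorSystemActivator
    import org.osgi.framework.BundleContext

    // Placeholder actor for the sketch
    class SomeActor extends Actor {
      def receive = { case msg => sender() ! msg }
    }

    // The base class creates the ActorSystem (using the bundle-delegating class
    // loader) when the bundle starts and then hands it to configure().
    class Activator extends ActorSystemActivator {
      def configure(context: BundleContext, system: ActorSystem): Unit = {
        system.actorOf(Props[SomeActor], name = "someActor")
      }
    }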
@@ -245,4 +245,9 @@ ReliableProxy Constructor Changed

 The constructor of ``ReliableProxy`` in ``akka-contrib`` has been changed to take an ``ActorPath`` instead of
 an ``ActorRef``. Also it takes new parameters to support reconnection. Use the new props factory methods, ``ReliableProxy.props``.
+
+Akka OSGi Aries Blueprint is Removed
+====================================
+
+``akka-osgi-aries`` has been removed. Similar can be implemented outside of Akka if needed.
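A hedged before/after sketch of the ``ReliableProxy`` change described above; the exact ``ReliableProxy.props`` overloads and parameters may differ from what is shown here::

    import scala.concurrent.duration._
    import akka.actor.{ Actor, ActorPath, ActorRef }
    import akka.contrib.pattern.ReliableProxy

    class ProxyingActor(targetPath: ActorPath) extends Actor {
      // Before (2.2) the proxy was built from an ActorRef, e.g.
      //   context.actorOf(Props(classOf[ReliableProxy], targetRef, 100.millis))
      // After (2.3) use the props factory with the target's ActorPath; further
      // reconnection parameters exist but are omitted in this sketch.
      val proxy: ActorRef = context.actorOf(ReliableProxy.props(targetPath, 100.millis))

      def receive = { case msg => proxy forward msg }
    }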
@ -1,18 +0,0 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
|
||||
|
||||
<!--
|
||||
Register the namespace handler in the OSGi Service Registry
|
||||
-->
|
||||
<service interface="org.apache.aries.blueprint.NamespaceHandler">
|
||||
<service-properties>
|
||||
<entry key="osgi.service.blueprint.namespace">
|
||||
<array>
|
||||
<value>http://akka.io/xmlns/blueprint/v1.0.0</value>
|
||||
</array>
|
||||
</entry>
|
||||
</service-properties>
|
||||
<bean class="akka.osgi.aries.blueprint.NamespaceHandler"/>
|
||||
</service>
|
||||
|
||||
</blueprint>
|
||||
|
|
@ -1,42 +0,0 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<xsd:schema xmlns="http://akka.io/xmlns/blueprint/v1.0.0"
|
||||
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
|
||||
xmlns:bp="http://www.osgi.org/xmlns/blueprint/v1.0.0"
|
||||
targetNamespace="http://akka.io/xmlns/blueprint/v1.0.0"
|
||||
elementFormDefault="qualified"
|
||||
attributeFormDefault="unqualified">
|
||||
|
||||
<xsd:import namespace="http://www.osgi.org/xmlns/blueprint/v1.0.0"/>
|
||||
|
||||
<xsd:annotation>
|
||||
<xsd:documentation>
|
||||
Defines the configuration elements for setting up Akka with Blueprint
|
||||
</xsd:documentation>
|
||||
</xsd:annotation>
|
||||
|
||||
<xsd:element name="actor-system">
|
||||
<xsd:complexType>
|
||||
<xsd:annotation>
|
||||
<xsd:documentation>
|
||||
Defines an Akka ActorSystem
|
||||
</xsd:documentation>
|
||||
</xsd:annotation>
|
||||
<xsd:sequence>
|
||||
<xsd:choice minOccurs="0" maxOccurs="unbounded">
|
||||
<xsd:element ref="config" />
|
||||
</xsd:choice>
|
||||
</xsd:sequence>
|
||||
<xsd:attribute name="id" type="xsd:ID"/>
|
||||
<xsd:attribute name="name" type="xsd:string" use="optional"/>
|
||||
</xsd:complexType>
|
||||
</xsd:element>
|
||||
|
||||
<xsd:element name="config" type="xsd:string">
|
||||
<xsd:annotation>
|
||||
<xsd:documentation>
|
||||
Defines an Akka ActorSystem configuration
|
||||
</xsd:documentation>
|
||||
</xsd:annotation>
|
||||
</xsd:element>
|
||||
|
||||
</xsd:schema>
|
||||
|
|
@ -1,42 +0,0 @@
|
|||
/**
|
||||
* Copyright (C) 2009-2013 Typesafe Inc. <http://www.typesafe.com>
|
||||
*/
|
||||
package akka.osgi.aries.blueprint
|
||||
|
||||
import org.osgi.framework.BundleContext
|
||||
import akka.osgi.OsgiActorSystemFactory
|
||||
import akka.actor.ActorSystem
|
||||
import com.typesafe.config.{ Config, ConfigFactory }
|
||||
|
||||
/**
|
||||
* A set of helper/factory classes to build a Akka system using Blueprint. This class is only meant to be used by
|
||||
* the [[akka.osgi.aries.blueprint.NamespaceHandler]] class, you should not use this class directly.
|
||||
*
|
||||
* If you're looking for a way to set up Akka using Blueprint without the namespace handler, you should use
|
||||
* [[akka.osgi.OsgiActorSystemFactory]] instead.
|
||||
*/
|
||||
class BlueprintActorSystemFactory(context: BundleContext, name: String, fallbackClassLoader: Option[ClassLoader]) extends OsgiActorSystemFactory(context, fallbackClassLoader) {
|
||||
|
||||
def this(context: BundleContext, name: String) = this(context, name, Some(OsgiActorSystemFactory.akkaActorClassLoader))
|
||||
|
||||
var config: Option[String] = None
|
||||
|
||||
lazy val system: ActorSystem = super.createActorSystem(if (name == null || name.isEmpty) None else Some(name))
|
||||
|
||||
def setConfig(config: String): Unit = this.config = Some(config)
|
||||
|
||||
def create(): ActorSystem = system
|
||||
|
||||
def destroy(): Unit = system.shutdown()
|
||||
|
||||
/**
|
||||
* Strategy method to create the Config for the ActorSystem, ensuring that the default/reference configuration is
|
||||
* loaded from the akka-actor bundle.
|
||||
*/
|
||||
override def actorSystemConfig(context: BundleContext): Config =
|
||||
config match {
|
||||
case Some(value) ⇒ ConfigFactory.parseString(value).withFallback(super.actorSystemConfig(context))
|
||||
case None ⇒ super.actorSystemConfig(context)
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -1,133 +0,0 @@
|
|||
/**
|
||||
* Copyright (C) 2009-2013 Typesafe Inc. <http://www.typesafe.com>
|
||||
*/
|
||||
package akka.osgi.aries.blueprint
|
||||
|
||||
import org.osgi.framework.BundleContext
|
||||
import org.osgi.service.blueprint.container.ComponentDefinitionException
|
||||
import org.osgi.service.blueprint.reflect.{ BeanMetadata, ComponentMetadata }
|
||||
import org.apache.aries.blueprint.ParserContext
|
||||
import org.apache.aries.blueprint.mutable.MutableBeanMetadata
|
||||
import org.apache.aries.blueprint.reflect.{ ValueMetadataImpl, RefMetadataImpl, BeanArgumentImpl }
|
||||
import org.w3c.dom.{ Element, Node }
|
||||
import java.util.concurrent.atomic.AtomicInteger
|
||||
|
||||
import akka.actor.ActorSystem
|
||||
import scala.annotation.tailrec
|
||||
import java.net.URL
|
||||
|
||||
object NamespaceHandler {
|
||||
private val ID_ATTRIBUTE = "id"
|
||||
private val NAME_ATTRIBUTE = "name"
|
||||
|
||||
private val BUNDLE_CONTEXT_REFID = "blueprintBundleContext"
|
||||
|
||||
private val ACTORSYSTEM_ELEMENT_NAME = "actor-system"
|
||||
private val CONFIG_ELEMENT_NAME = "config"
|
||||
|
||||
private val DESTROY_METHOD_NAME = "destroy"
|
||||
private val FACTORY_METHOD_NAME = "create"
|
||||
}
|
||||
|
||||
/**
|
||||
* Aries Blueprint namespace handler implementation. This namespace handler will allow users of Apache Aries' Blueprint
|
||||
* implementation to define their Akka [[akka.actor.ActorSystem]] using a syntax like this:
|
||||
*
|
||||
* {{{
|
||||
* <?xml version="1.0" encoding="UTF-8"?>
|
||||
* <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
|
||||
* xmlns:akka="http://akka.io/xmlns/blueprint/v1.0.0">
|
||||
*
|
||||
* <akka:actor-system name="config">
|
||||
* <akka:config>
|
||||
* some.config {
|
||||
* key=value
|
||||
* }
|
||||
* </akka:config>
|
||||
* </akka:actor-system>
|
||||
*
|
||||
* </blueprint>
|
||||
* }}}
|
||||
*
|
||||
* Users of other IoC frameworks in an OSGi environment should use [[akka.osgi.OsgiActorSystemFactory]] instead.
|
||||
*/
|
||||
class NamespaceHandler extends org.apache.aries.blueprint.NamespaceHandler {
|
||||
|
||||
import NamespaceHandler._
|
||||
|
||||
protected val idCounter = new AtomicInteger(0)
|
||||
|
||||
override def getSchemaLocation(namespace: String): URL = getClass().getResource("akka.xsd")
|
||||
|
||||
override def getManagedClasses = java.util.Collections.singleton(classOf[BlueprintActorSystemFactory])
|
||||
|
||||
override def parse(element: Element, context: ParserContext) =
|
||||
if (element.getLocalName == ACTORSYSTEM_ELEMENT_NAME) parseActorSystem(element, context)
|
||||
else throw new ComponentDefinitionException("Unexpected element for Akka namespace: %s".format(element))
|
||||
|
||||
override def decorate(node: Node, component: ComponentMetadata, context: ParserContext) =
|
||||
throw new ComponentDefinitionException("Bad xml syntax: node decoration is not supported")
|
||||
|
||||
/*
|
||||
* Parse <akka:actor-system/>
|
||||
*/
|
||||
def parseActorSystem(element: Element, context: ParserContext) = {
|
||||
val factory = createFactoryBean(context, element.getAttribute(NAME_ATTRIBUTE))
|
||||
|
||||
val nodelist = element.getChildNodes
|
||||
(0 until nodelist.getLength) collect {
|
||||
case idx if nodelist.item(idx).getNodeType == Node.ELEMENT_NODE ⇒ nodelist.item(idx).asInstanceOf[Element]
|
||||
} foreach {
|
||||
case child if child.getLocalName == CONFIG_ELEMENT_NAME ⇒ parseConfig(child, context, factory)
|
||||
case child ⇒ throw new ComponentDefinitionException("Unexpected child element %s found in %s".format(child, element))
|
||||
}
|
||||
|
||||
createActorSystemBean(context, element, factory)
|
||||
}
|
||||
|
||||
/*
|
||||
* Parse <akka:config/>
|
||||
*/
|
||||
protected def parseConfig(node: Element, context: ParserContext, factory: MutableBeanMetadata) =
|
||||
factory.addProperty(CONFIG_ELEMENT_NAME, new ValueMetadataImpl(node.getTextContent))
|
||||
|
||||
@tailrec protected final def findAvailableId(context: ParserContext): String =
|
||||
".akka-" + idCounter.incrementAndGet() match {
|
||||
case id if context.getComponentDefinitionRegistry.containsComponentDefinition(id) ⇒ findAvailableId(context)
|
||||
case available ⇒ available
|
||||
}
|
||||
|
||||
/*
|
||||
* Create the bean definition for the ActorSystem
|
||||
*/
|
||||
protected def createActorSystemBean(context: ParserContext, element: Element, factory: MutableBeanMetadata): MutableBeanMetadata = {
|
||||
val system = context.createMetadata(classOf[MutableBeanMetadata])
|
||||
val id = if (element.hasAttribute(ID_ATTRIBUTE)) element.getAttribute(ID_ATTRIBUTE) else findAvailableId(context)
|
||||
|
||||
system.setId(id)
|
||||
system.setFactoryComponent(factory)
|
||||
|
||||
system.setFactoryMethod(FACTORY_METHOD_NAME)
|
||||
system.setRuntimeClass(classOf[ActorSystem])
|
||||
system
|
||||
}
|
||||
|
||||
/*
|
||||
* Create the bean definition for the BlueprintActorSystemFactory
|
||||
*/
|
||||
protected def createFactoryBean(context: ParserContext, name: String): MutableBeanMetadata = {
|
||||
val factory = context.createMetadata(classOf[MutableBeanMetadata])
|
||||
factory.setId(findAvailableId(context))
|
||||
factory.setScope(BeanMetadata.SCOPE_SINGLETON)
|
||||
factory.setProcessor(true)
|
||||
factory.setRuntimeClass(classOf[BlueprintActorSystemFactory])
|
||||
|
||||
factory.setDestroyMethod(DESTROY_METHOD_NAME)
|
||||
|
||||
factory.addArgument(new BeanArgumentImpl(new RefMetadataImpl(BUNDLE_CONTEXT_REFID), classOf[BundleContext].getName, -1))
|
||||
factory.addArgument(new BeanArgumentImpl(new ValueMetadataImpl(name), classOf[String].getName, -1))
|
||||
factory.setProcessor(true)
|
||||
context.getComponentDefinitionRegistry.registerComponentDefinition(factory)
|
||||
factory
|
||||
}
|
||||
}
|
||||
|
|
@ -1,15 +0,0 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
|
||||
xmlns:akka="http://akka.io/xmlns/blueprint/v1.0.0">
|
||||
|
||||
<service ref="system" interface="akka.actor.ActorSystem" />
|
||||
|
||||
<akka:actor-system id="system" name="config">
|
||||
<akka:config>
|
||||
some.config {
|
||||
key=value
|
||||
}
|
||||
</akka:config>
|
||||
</akka:actor-system>
|
||||
|
||||
</blueprint>
|
||||
|
|
@ -1,13 +0,0 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
|
||||
xmlns:akka="http://akka.io/xmlns/blueprint/v1.0.0">
|
||||
|
||||
<service interface="akka.osgi.aries.blueprint.ActorSystemAwareBean">
|
||||
<bean class="akka.osgi.aries.blueprint.ActorSystemAwareBean">
|
||||
<argument>
|
||||
<akka:actor-system name="simple" />
|
||||
</argument>
|
||||
</bean>
|
||||
</service>
|
||||
|
||||
</blueprint>
|
||||
|
|
@ -1,9 +0,0 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
|
||||
xmlns:akka="http://akka.io/xmlns/blueprint/v1.0.0">
|
||||
|
||||
<service ref="system" interface="akka.actor.ActorSystem" />
|
||||
|
||||
<akka:actor-system id="system" name="simple" />
|
||||
|
||||
</blueprint>
|
||||
|
|
@ -1,14 +0,0 @@
|
|||
/**
|
||||
* Copyright (C) 2009-2013 Typesafe Inc. <http://www.typesafe.com>
|
||||
*/
|
||||
package akka.osgi.aries.blueprint
|
||||
|
||||
import akka.actor.ActorSystem
|
||||
|
||||
/**
|
||||
* Just a simple POJO that can contain an actor system.
|
||||
* Used for testing dependency injection with Blueprint
|
||||
*/
|
||||
class ActorSystemAwareBean(val system: ActorSystem) {
|
||||
|
||||
}
|
||||
|
|
@ -1,112 +0,0 @@
|
|||
/**
|
||||
* Copyright (C) 2009-2013 Typesafe Inc. <http://www.typesafe.com>
|
||||
*/
|
||||
package akka.osgi.aries.blueprint
|
||||
|
||||
import org.scalatest.WordSpec
|
||||
import akka.actor.ActorSystem
|
||||
import de.kalpatec.pojosr.framework.launch.BundleDescriptor
|
||||
import akka.osgi.PojoSRTestSupport
|
||||
import akka.osgi.PojoSRTestSupport.bundle
|
||||
import org.scalatest.Matchers
|
||||
|
||||
/**
|
||||
* Test cases for {@link ActorSystemActivator}
|
||||
*/
|
||||
object NamespaceHandlerTest {
|
||||
|
||||
/*
|
||||
* Bundle-SymbolicName to easily find our test bundle
|
||||
*/
|
||||
val TEST_BUNDLE_NAME = "akka.osgi.test.aries.namespace"
|
||||
|
||||
/*
|
||||
* Bundle descriptor representing the akka-osgi bundle itself
|
||||
*/
|
||||
val AKKA_OSGI_BLUEPRINT =
|
||||
bundle("akka-osgi").withBlueprintFile(getClass.getResource("/OSGI-INF/blueprint/akka-namespacehandler.xml"))
|
||||
|
||||
}
|
||||
|
||||
class SimpleNamespaceHandlerTest extends WordSpec with Matchers with PojoSRTestSupport {
|
||||
|
||||
import NamespaceHandlerTest._
|
||||
|
||||
val testBundles = buildTestBundles(List(
|
||||
AKKA_OSGI_BLUEPRINT,
|
||||
bundle(TEST_BUNDLE_NAME).withBlueprintFile(getClass.getResource("simple.xml"))))
|
||||
|
||||
"simple.xml" must {
|
||||
"set up ActorSystem when bundle starts" in {
|
||||
filterErrors() {
|
||||
serviceForType[ActorSystem] should not be (null)
|
||||
}
|
||||
}
|
||||
|
||||
"stop the ActorSystem when bundle stops" in {
|
||||
filterErrors() {
|
||||
val system = serviceForType[ActorSystem]
|
||||
system.isTerminated should be(false)
|
||||
|
||||
bundleForName(TEST_BUNDLE_NAME).stop()
|
||||
|
||||
system.awaitTermination()
|
||||
system.isTerminated should be(true)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
class ConfigNamespaceHandlerTest extends WordSpec with Matchers with PojoSRTestSupport {
|
||||
|
||||
import NamespaceHandlerTest._
|
||||
|
||||
val testBundles = buildTestBundles(List(
|
||||
AKKA_OSGI_BLUEPRINT,
|
||||
bundle(TEST_BUNDLE_NAME).withBlueprintFile(getClass.getResource("config.xml"))))
|
||||
|
||||
"config.xml" must {
|
||||
"set up ActorSystem when bundle starts" in {
|
||||
filterErrors() {
|
||||
val system = serviceForType[ActorSystem]
|
||||
system should not be (null)
|
||||
system.settings.config.getString("some.config.key") should be("value")
|
||||
}
|
||||
}
|
||||
|
||||
"stop the ActorSystem when bundle stops" in {
|
||||
filterErrors() {
|
||||
val system = serviceForType[ActorSystem]
|
||||
system.isTerminated should be(false)
|
||||
|
||||
bundleForName(TEST_BUNDLE_NAME).stop()
|
||||
|
||||
system.awaitTermination()
|
||||
system.isTerminated should be(true)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
class DependencyInjectionNamespaceHandlerTest extends WordSpec with Matchers with PojoSRTestSupport {
|
||||
|
||||
import NamespaceHandlerTest._
|
||||
|
||||
val testBundles = buildTestBundles(List(
|
||||
AKKA_OSGI_BLUEPRINT,
|
||||
bundle(TEST_BUNDLE_NAME).withBlueprintFile(getClass.getResource("injection.xml"))))
|
||||
|
||||
"injection.xml" must {
|
||||
|
||||
"set up bean containing ActorSystem" in {
|
||||
filterErrors() {
|
||||
val bean = serviceForType[ActorSystemAwareBean]
|
||||
bean should not be (null)
|
||||
bean.system should not be (null)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
|
@@ -21,7 +21,7 @@ import com.typesafe.config.{ ConfigFactory, Config }
 abstract class ActorSystemActivator extends BundleActivator {

   private var system: Option[ActorSystem] = None
-  private var registration: Option[ServiceRegistration] = None
+  private var registration: Option[ServiceRegistration[_]] = None

   /**
    * Implement this method to add your own actors to the ActorSystem. If you want to share the actor
@@ -72,7 +72,7 @@ abstract class ActorSystemActivator extends BundleActivator {
   /**
    * Convenience method to find a service by its reference.
    */
-  def serviceForReference[T](context: BundleContext, reference: ServiceReference): T =
+  def serviceForReference[T](context: BundleContext, reference: ServiceReference[_]): T =
     context.getService(reference).asInstanceOf[T]

   /**
@@ -0,0 +1,95 @@
+/**
+ * Copyright (C) 2009-2013 Typesafe Inc. <http://www.typesafe.com>
+ */
+package akka.osgi
+
+import language.existentials
+import java.net.URL
+import java.util.Enumeration
+import org.osgi.framework.{ BundleContext, Bundle }
+import scala.util.Try
+import scala.io.Source
+import org.osgi.framework.wiring.{ BundleRevision, BundleWire, BundleWiring }
+import scala.collection.JavaConverters._
+import scala.util.Success
+import scala.util.Failure
+import scala.annotation.tailrec
+
+/*
+ * Companion object to create bundle delegating ClassLoader instances
+ */
+object BundleDelegatingClassLoader {
+
+  /*
+   * Create a bundle delegating ClassLoader for the bundle context's bundle
+   */
+  def apply(context: BundleContext): BundleDelegatingClassLoader = new BundleDelegatingClassLoader(context.getBundle, null)
+
+  def apply(context: BundleContext, fallBackCLassLoader: Option[ClassLoader]): BundleDelegatingClassLoader =
+    new BundleDelegatingClassLoader(context.getBundle, fallBackCLassLoader.orNull)
+}
+
+/*
+ * A bundle delegating ClassLoader implementation - this will try to load classes and resources from the bundle
+ * and the bundles transitive dependencies. If there's a ClassLoader specified, that will be used as a fallback.
+ */
+class BundleDelegatingClassLoader(bundle: Bundle, fallBackClassLoader: ClassLoader) extends ClassLoader(fallBackClassLoader) {
+
+  private val bundles = findTransitiveBundles(bundle).toList
+
+  override def findClass(name: String): Class[_] = {
+    @tailrec def find(remaining: List[Bundle]): Class[_] = {
+      if (remaining.isEmpty) throw new ClassNotFoundException(name)
+      else Try { remaining.head.loadClass(name) } match {
+        case Success(cls) ⇒ cls
+        case Failure(_)   ⇒ find(remaining.tail)
+      }
+    }
+    find(bundles)
+  }
+
+  override def findResource(name: String): URL = {
+    @tailrec def find(remaining: List[Bundle]): URL = {
+      if (remaining.isEmpty) getParent.getResource(name)
+      else Option { remaining.head.getResource(name) } match {
+        case Some(r) ⇒ r
+        case None    ⇒ find(remaining.tail)
+      }
+    }
+    find(bundles)
+  }
+
+  override def findResources(name: String): Enumeration[URL] = {
+    val resources = bundles.flatMap {
+      bundle ⇒ Option(bundle.getResources(name)).map { _.asScala.toList }.getOrElse(Nil)
+    }
+    java.util.Collections.enumeration(resources.asJava)
+  }
+
+  private def findTransitiveBundles(bundle: Bundle): Set[Bundle] = {
+    @tailrec def process(processed: Set[Bundle], remaining: Set[Bundle]): Set[Bundle] = {
+      if (remaining.isEmpty) {
+        processed
+      } else {
+        val (b, rest) = (remaining.head, remaining.tail)
+        if (processed contains b) {
+          process(processed, rest)
+        } else {
+          val wiring = b.adapt(classOf[BundleWiring])
+          val direct: Set[Bundle] =
+            if (wiring == null) Set.empty
+            else {
+              val requiredWires: List[BundleWire] =
+                wiring.getRequiredWires(BundleRevision.PACKAGE_NAMESPACE).asScala.toList
+              requiredWires.flatMap {
+                wire ⇒ Option(wire.getProviderWiring) map { _.getBundle }
+              }.toSet
+            }
+          process(processed + b, rest ++ (direct -- processed))
+        }
+      }
+    }
+    process(Set.empty, Set(bundle))
+  }
+}
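A short usage sketch for the class added above, for instance from inside a ``BundleActivator``. The config-loading step is an assumption about how a caller might use the resulting class loader, not code from this commit::

    import akka.osgi.BundleDelegatingClassLoader
    import com.typesafe.config.ConfigFactory
    import org.osgi.framework.{ BundleActivator, BundleContext }

    class ConfigLoadingActivator extends BundleActivator {
      def start(context: BundleContext): Unit = {
        // Build the delegating class loader from this bundle's context; it sees the
        // bundle itself, its transitive package wiring, and the given fallback.
        val loader = BundleDelegatingClassLoader(context, Some(getClass.getClassLoader))
        val config = ConfigFactory.load(loader) // picks up reference.conf from wired bundles
        println(config.getString("akka.version"))
      }
      def stop(context: BundleContext): Unit = ()
    }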
@@ -3,7 +3,6 @@
  */
 package akka.osgi

-import impl.BundleDelegatingClassLoader
 import akka.actor.ActorSystem
 import com.typesafe.config.{ ConfigFactory, Config }
 import org.osgi.framework.BundleContext
@@ -18,7 +17,7 @@ class OsgiActorSystemFactory(val context: BundleContext, val fallbackClassLoader
   /*
    * Classloader that delegates to the bundle for which the factory is creating an ActorSystem
    */
-  private val classloader = new BundleDelegatingClassLoader(context.getBundle, fallbackClassLoader)
+  private val classloader = BundleDelegatingClassLoader(context, fallbackClassLoader)

   /**
    * Creates the [[akka.actor.ActorSystem]], using the name specified
@@ -55,7 +54,7 @@ object OsgiActorSystemFactory {
   /**
    * Class loader of akka-actor bundle.
    */
-  def akkaActorClassLoader = classOf[ActorSystem].getClassLoader
+  def akkaActorClassLoader = classOf[ActorSystemActivator].getClassLoader

   /*
    * Create an [[OsgiActorSystemFactory]] instance to set up Akka in an OSGi environment
@ -1,60 +0,0 @@
|
|||
/**
|
||||
* Copyright (C) 2009-2013 Typesafe Inc. <http://www.typesafe.com>
|
||||
*/
|
||||
package akka.osgi.impl
|
||||
|
||||
import language.existentials
|
||||
|
||||
import java.net.URL
|
||||
import java.util.Enumeration
|
||||
import org.osgi.framework.{ BundleContext, Bundle }
|
||||
|
||||
/*
|
||||
* Companion object to create bundle delegating classloader instances
|
||||
*/
|
||||
object BundleDelegatingClassLoader {
|
||||
|
||||
/*
|
||||
* Create a bundle delegating classloader for the bundle context's bundle
|
||||
*/
|
||||
def apply(context: BundleContext): BundleDelegatingClassLoader = new BundleDelegatingClassLoader(context.getBundle)
|
||||
|
||||
}
|
||||
|
||||
/*
|
||||
* A bundle delegating classloader implemenation - this will try to load classes and resources from the bundle
|
||||
* specified first and if there's a classloader specified, that will be used as a fallback
|
||||
*/
|
||||
class BundleDelegatingClassLoader(bundle: Bundle, classLoader: Option[ClassLoader]) extends ClassLoader {
|
||||
|
||||
def this(bundle: Bundle) = this(bundle, None)
|
||||
|
||||
protected override def findClass(name: String): Class[_] = bundle.loadClass(name)
|
||||
|
||||
protected override def findResource(name: String): URL =
|
||||
bundle.getResource(name) match {
|
||||
case null if classLoader.isDefined ⇒ classLoader.get.getResource(name)
|
||||
case result ⇒ result
|
||||
}
|
||||
|
||||
@SuppressWarnings(Array("unchecked", "rawtypes"))
|
||||
protected override def findResources(name: String): Enumeration[URL] =
|
||||
bundle.getResources(name).asInstanceOf[Enumeration[URL]]
|
||||
|
||||
protected override def loadClass(name: String, resolve: Boolean): Class[_] = {
|
||||
val clazz = try {
|
||||
try findClass(name) catch { case _: ClassNotFoundException if classLoader.isDefined ⇒ classLoader.get.loadClass(name) } // First fall back to secondary loader
|
||||
} catch {
|
||||
case cnfe: ClassNotFoundException ⇒
|
||||
throw new ClassNotFoundException("%s from bundle %s (%s)".format(name, bundle.getBundleId, bundle.getSymbolicName), cnfe) // IF we have no secondary loader or that failed as well, wrap and rethrow
|
||||
}
|
||||
|
||||
if (resolve)
|
||||
resolveClass(clazz)
|
||||
|
||||
clazz
|
||||
}
|
||||
|
||||
override val toString: String = "BundleDelegatingClassLoader(%s)".format(bundle.getBundleId)
|
||||
}
|
||||
|
||||
|
|
@ -41,7 +41,7 @@ class PingPongActorSystemActivatorTest extends WordSpec with Matchers with PojoS
|
|||
"start and register the ActorSystem when bundle starts" in {
|
||||
filterErrors() {
|
||||
val system = serviceForType[ActorSystem]
|
||||
val actor = system.actorFor("/user/pong")
|
||||
val actor = system.actorSelection("/user/pong")
|
||||
|
||||
implicit val timeout = Timeout(5 seconds)
|
||||
Await.result(actor ? Ping, timeout.duration) should be(Pong)
|
||||
|
|
|
|||
|
|
@ -70,18 +70,18 @@ trait PojoSRTestSupport extends Suite with BeforeAndAfterAll {
|
|||
def serviceForType[T](implicit t: ClassTag[T]): T =
|
||||
context.getService(awaitReference(t.runtimeClass)).asInstanceOf[T]
|
||||
|
||||
def awaitReference(serviceType: Class[_]): ServiceReference = awaitReference(serviceType, SleepyTime)
|
||||
def awaitReference[T](serviceType: Class[T]): ServiceReference[T] = awaitReference(serviceType, SleepyTime)
|
||||
|
||||
def awaitReference(serviceType: Class[_], wait: FiniteDuration): ServiceReference = {
|
||||
def awaitReference[T](serviceType: Class[T], wait: FiniteDuration): ServiceReference[T] = {
|
||||
|
||||
@tailrec def poll(step: Duration, deadline: Deadline): ServiceReference = context.getServiceReference(serviceType.getName) match {
|
||||
@tailrec def poll(step: Duration, deadline: Deadline): ServiceReference[T] = context.getServiceReference(serviceType.getName) match {
|
||||
case null ⇒
|
||||
if (deadline.isOverdue()) fail("Gave up waiting for service of type %s".format(serviceType))
|
||||
else {
|
||||
Thread.sleep((step min deadline.timeLeft max Duration.Zero).toMillis)
|
||||
poll(step, deadline)
|
||||
}
|
||||
case some ⇒ some
|
||||
case some ⇒ some.asInstanceOf[ServiceReference[T]]
|
||||
}
|
||||
|
||||
poll(wait, Deadline.now + MaxWaitDuration)
|
||||
|
|
|
|||
|
|
@ -5,7 +5,7 @@
|
|||
<parent>
|
||||
<groupId>com.typesafe.akka.akka-sample-osgi-dining-hakkers</groupId>
|
||||
<artifactId>project</artifactId>
|
||||
<version>2.2.0-SNAPSHOT</version>
|
||||
<version>2.3-SNAPSHOT</version>
|
||||
</parent>
|
||||
<modelVersion>4.0.0</modelVersion>
|
||||
|
||||
|
|
|
|||
|
|
@ -8,7 +8,7 @@ import akka.actor.ActorRef
|
|||
/*
|
||||
* Define our messages, they basically speak for themselves
|
||||
*/
|
||||
sealed trait DiningHakkerMessage
|
||||
sealed trait DiningHakkerMessage extends Serializable
|
||||
|
||||
case class Busy(chopstick: ActorRef) extends DiningHakkerMessage
|
||||
|
||||
|
|
@ -18,10 +18,20 @@ case class Take(hakker: ActorRef) extends DiningHakkerMessage
|
|||
|
||||
case class Taken(chopstick: ActorRef) extends DiningHakkerMessage
|
||||
|
||||
object Eat extends DiningHakkerMessage
|
||||
case object Eat extends DiningHakkerMessage
|
||||
|
||||
object Think extends DiningHakkerMessage
|
||||
case object Think extends DiningHakkerMessage
|
||||
|
||||
object Identify extends DiningHakkerMessage
|
||||
case object Identify extends DiningHakkerMessage
|
||||
|
||||
case class Identification(name: String, busyWith: String) extends DiningHakkerMessage
|
||||
|
||||
case object SubscribeToHakkerStateChanges extends DiningHakkerMessage
|
||||
|
||||
case class HakkerStateChange(hakkerName: String, from: String, to: String)
|
||||
|
||||
case class TrackHakker(hakker: ActorRef) extends DiningHakkerMessage
|
||||
|
||||
case class GetEatingCount(hakkerName: String) extends DiningHakkerMessage
|
||||
|
||||
case class EatingCount(hakkerName: String, count: Int) extends DiningHakkerMessage
|
||||
|
|
|
|||
|
|
@ -19,4 +19,5 @@ import akka.actor.ActorRef
|
|||
|
||||
trait DiningHakkersService {
|
||||
def getHakker(name: String, chairNumber: Int): ActorRef
|
||||
def getTracker(): ActorRef
|
||||
}
|
||||
|
|
|
|||
|
|
@ -7,7 +7,7 @@
|
|||
<parent>
|
||||
<groupId>com.typesafe.akka.akka-sample-osgi-dining-hakkers</groupId>
|
||||
<artifactId>project</artifactId>
|
||||
<version>2.2.0-SNAPSHOT</version>
|
||||
<version>2.3-SNAPSHOT</version>
|
||||
</parent>
|
||||
|
||||
<artifactId>akka-sample-osgi-dining-hakkers-dist</artifactId>
|
||||
|
|
|
|||
|
|
@ -7,7 +7,7 @@
|
|||
<parent>
|
||||
<groupId>com.typesafe.akka.akka-sample-osgi-dining-hakkers</groupId>
|
||||
<artifactId>project</artifactId>
|
||||
<version>2.2.0-SNAPSHOT</version>
|
||||
<version>2.3-SNAPSHOT</version>
|
||||
</parent>
|
||||
|
||||
<artifactId>akka-sample-osgi-dining-hakkers</artifactId>
|
||||
|
|
|
|||
|
|
@ -7,9 +7,10 @@
|
|||
|
||||
<feature name='scala' description='Scala' version='${scala.version}' resolver='(obr)'>
|
||||
<bundle start-level="15">mvn:org.scala-lang/scala-library/${scala.version}</bundle>
|
||||
<bundle start-level="15">mvn:org.scala-lang/scala-reflect/${scala.version}</bundle>
|
||||
</feature>
|
||||
|
||||
<feature name='uncommons-maths' description='Uncommons Maths' version='1.2.2' resolver='(obr)'>
|
||||
<feature name='uncommons-maths' description='Uncommons Maths' version='${uncommons-maths.version}' resolver='(obr)'>
|
||||
<!-- TODO replace this with upstream uncommons maths 1.2.3 which is OSGi enabled -->
|
||||
<bundle start-level="20">mvn:com.typesafe.akka.akka-sample-osgi-dining-hakkers/uncommons/1.2.2</bundle>
|
||||
</feature>
|
||||
|
|
@ -22,24 +23,32 @@
|
|||
<bundle start-level="20">mvn:io.netty/netty/${netty.version}</bundle>
|
||||
</feature>
|
||||
|
||||
<feature name='leveldb' description='LevelDB feature' version='${leveldb.version}' resolver='(obr)'>
|
||||
<bundle start-level="20">wrap:mvn:org.iq80.leveldb/leveldb/${leveldb.version}</bundle>
|
||||
<bundle start-level="20">wrap:mvn:org.fusesource.leveldbjni/leveldbjni-all/${leveldbjni.version}</bundle>
|
||||
</feature>
|
||||
|
||||
<feature name='typesafe-config' description='Typesafe config' version='${typesafe.config.version}' resolver='(obr)'>
|
||||
<feature version="[${scala.version},2.11.0)">scala</feature>
|
||||
<bundle start-level="25">mvn:com.typesafe/config/${typesafe.config.version}</bundle>
|
||||
</feature>
|
||||
|
||||
<feature name='akka' description='Akka' version='${akka.version}' resolver='(obr)'>
|
||||
<feature version="[${scala.version},2.11.0)">scala</feature>
|
||||
<feature version="[${netty.version},4.0.0)">netty</feature>
|
||||
<feature version="[1.2.2,3.0.0)">uncommons-maths</feature>
|
||||
<feature version="[${protobuf.version},3.0.0)">protobuf</feature>
|
||||
<feature version="[${typesafe.config.version},2.0.0)">typesafe-config</feature>
|
||||
<bundle start-level="30">mvn:com.typesafe.akka/akka-cluster_${scala.dep.version}/${akka.version}</bundle>
|
||||
<bundle start-level="30">mvn:com.typesafe.akka/akka-remote_${scala.dep.version}/${akka.version}</bundle>
|
||||
<feature version="${scala.version}">scala</feature>
|
||||
<feature version="${netty.version}">netty</feature>
|
||||
<feature version="${uncommons-maths.version}">uncommons-maths</feature>
|
||||
<feature version="${protobuf.version}">protobuf</feature>
|
||||
<feature version="${typesafe.config.version}">typesafe-config</feature>
|
||||
<feature version="${leveldb.version}">leveldb</feature>
|
||||
<bundle start-level="30">mvn:com.typesafe.akka/akka-actor_${scala.dep.version}/${akka.version}</bundle>
|
||||
<bundle start-level="30">mvn:com.typesafe.akka/akka-osgi_${scala.dep.version}/${akka.version}</bundle>
|
||||
<bundle start-level="30">mvn:com.typesafe.akka/akka-remote_${scala.dep.version}/${akka.version}</bundle>
|
||||
<bundle start-level="30">mvn:com.typesafe.akka/akka-cluster_${scala.dep.version}/${akka.version}</bundle>
|
||||
<bundle start-level="30">mvn:com.typesafe.akka/akka-persistence-experimental_${scala.dep.version}/${akka.version}</bundle>
|
||||
<bundle start-level="30">mvn:com.typesafe.akka/akka-slf4j_${scala.dep.version}/${akka.version}</bundle>
|
||||
</feature>
|
||||
|
||||
<feature name='dining-hakker' description='Akka Dining Hakker Sample' version='${project.version}' resolver='(obr)'>
|
||||
<feature version="[${akka.version},3.0.0)">akka</feature>
|
||||
<feature version="${akka.version}">akka</feature>
|
||||
<bundle start-level="50">mvn:com.typesafe.akka.akka-sample-osgi-dining-hakkers/api/${project.version}</bundle>
|
||||
<bundle start-level="50">mvn:com.typesafe.akka.akka-sample-osgi-dining-hakkers/core/${project.version}</bundle>
|
||||
<bundle start-level="50">mvn:com.typesafe.akka.akka-sample-osgi-dining-hakkers/command/${project.version}</bundle>
|
||||
|
|
|
|||
|
|
@ -5,7 +5,7 @@
|
|||
<parent>
|
||||
<groupId>com.typesafe.akka.akka-sample-osgi-dining-hakkers</groupId>
|
||||
<artifactId>project</artifactId>
|
||||
<version>2.2.0-SNAPSHOT</version>
|
||||
<version>2.3-SNAPSHOT</version>
|
||||
</parent>
|
||||
<modelVersion>4.0.0</modelVersion>
|
||||
|
||||
|
|
|
|||
|
|
@ -32,7 +32,7 @@ class Activator extends BundleActivator {
|
|||
}
|
||||
|
||||
def startHakker(service: DiningHakkersService, name: String) {
|
||||
hakker = Some(service.getHakker(name, (math.floor(math.random * 5)).toInt))
|
||||
hakker = Some(service.getHakker(name, 4))
|
||||
}
|
||||
|
||||
def stop(context: BundleContext) {
|
||||
|
|
|
|||
|
|
@ -5,7 +5,7 @@
|
|||
<parent>
|
||||
<groupId>com.typesafe.akka.akka-sample-osgi-dining-hakkers</groupId>
|
||||
<artifactId>project</artifactId>
|
||||
<version>2.2.0-SNAPSHOT</version>
|
||||
<version>2.3-SNAPSHOT</version>
|
||||
</parent>
|
||||
<modelVersion>4.0.0</modelVersion>
|
||||
|
||||
|
|
@ -30,6 +30,14 @@
|
|||
<groupId>com.typesafe.akka</groupId>
|
||||
<artifactId>akka-remote_${scala.dep.version}</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>com.typesafe.akka</groupId>
|
||||
<artifactId>akka-slf4j_${scala.dep.version}</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>com.typesafe.akka</groupId>
|
||||
<artifactId>akka-persistence-experimental_${scala.dep.version}</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>com.typesafe</groupId>
|
||||
<artifactId>config</artifactId>
|
||||
|
|
@ -39,14 +47,6 @@
|
|||
<artifactId>api</artifactId>
|
||||
</dependency>
|
||||
|
||||
<!--Test dependencies-->
|
||||
<!-- <dependency>
|
||||
<groupId>org.specs2</groupId>
|
||||
<artifactId>specs2_${scala.version}</artifactId>
|
||||
<version>1.12.3</version>
|
||||
<scope>test</scope>
|
||||
</dependency>-->
|
||||
|
||||
<!--remoting dependencies-->
|
||||
<dependency>
|
||||
<groupId>io.netty</groupId>
|
||||
|
|
@ -61,9 +61,12 @@
|
|||
<artifactId>akka-cluster_${scala.dep.version}</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.fusesource</groupId>
|
||||
<artifactId>sigar</artifactId>
|
||||
<version>1.6.4</version>
|
||||
<groupId>org.iq80.leveldb</groupId>
|
||||
<artifactId>leveldb</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.fusesource.leveldbjni</groupId>
|
||||
<artifactId>leveldbjni-all</artifactId>
|
||||
</dependency>
|
||||
</dependencies>
|
||||
<build>
|
||||
|
|
|
|||
|
|
@@ -1,7 +1,19 @@
 akka {

+  loggers = ["akka.event.slf4j.Slf4jLogger", "akka.event.Logging$DefaultLogger"]
+
   actor {
     provider = "akka.cluster.ClusterActorRefProvider"
+
+    serialize-messages = on
+
+    serializers {
+      dining = "akka.sample.osgi.serialization.DiningHakkerSerializer"
+    }
+
+    serialization-bindings {
+      "akka.sample.osgi.api.DiningHakkerMessage" = dining
+    }
   }

   remote {
@@ -13,7 +25,7 @@ akka {

   cluster {
     seed-nodes = ["akka.tcp://akka-osgi-sample@localhost:4242"]
-    auto-down = on
+    auto-down-unreachable-after = 20 s
   }

 }
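One way to sanity-check the new serializer binding above (a sketch; it assumes the sample's ``reference.conf`` and classes are on the classpath of a plain JVM run)::

    import akka.actor.ActorSystem
    import akka.sample.osgi.api.Eat
    import akka.serialization.SerializationExtension

    object SerializationCheck extends App {
      val system = ActorSystem("checkSerialization")
      // With the serialization-bindings above, every DiningHakkerMessage should be
      // handled by DiningHakkerSerializer instead of plain Java serialization.
      val serializer = SerializationExtension(system).findSerializerFor(Eat)
      println(serializer.getClass.getName) // expected: akka.sample.osgi.serialization.DiningHakkerSerializer
      system.shutdown()
    }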
|
@ -22,13 +22,12 @@ import akka.sample.osgi.service.DiningHakkersServiceImpl
|
|||
import akka.sample.osgi.api.DiningHakkersService
|
||||
import akka.event.{ LogSource, Logging }
|
||||
import org.osgi.framework.{ ServiceRegistration, BundleContext }
|
||||
import scala.collection.mutable.ListBuffer
|
||||
|
||||
class Activator extends ActorSystemActivator {
|
||||
|
||||
import Activator._
|
||||
|
||||
var diningHakkerService: Option[ServiceRegistration] = None
|
||||
var diningHakkerService: Option[ServiceRegistration[_]] = None
|
||||
|
||||
def configure(context: BundleContext, system: ActorSystem) {
|
||||
val log = Logging(system, this)
|
||||
|
|
|
|||
|
|
@ -4,12 +4,15 @@
|
|||
package akka.sample.osgi.internal
|
||||
|
||||
import language.postfixOps
|
||||
import scala.concurrent.duration._
|
||||
import akka.actor.Terminated
|
||||
import akka.cluster.Cluster
|
||||
import akka.cluster.ClusterEvent.{ CurrentClusterState, LeaderChanged }
|
||||
import akka.event.Logging
|
||||
import scala.concurrent.duration._
|
||||
import akka.sample.osgi.api._
|
||||
import akka.actor.{ RootActorPath, Address, ActorRef, Actor }
|
||||
import akka.sample.osgi.api.SubscribeToHakkerStateChanges
|
||||
import akka.sample.osgi.api.HakkerStateChange
|
||||
|
||||
//Akka adaptation of
|
||||
//http://www.dalnefre.com/wp/2010/08/dining-philosophers-in-humus/
|
||||
|
|
@ -68,11 +71,14 @@ class Hakker(name: String, chair: Int) extends Actor {
|
|||
cluster.unsubscribe(self)
|
||||
}
|
||||
|
||||
var subscribers = Set.empty[ActorRef]
|
||||
|
||||
//When a hakker is thinking it can become hungry
|
||||
//and try to pick up its chopsticks and eat
|
||||
def thinking(left: ActorRef, right: ActorRef): Receive = {
|
||||
case Eat =>
|
||||
become(hungry(left, right) orElse (clusterEvents))
|
||||
pubStateChange("thinking", "hungry")
|
||||
become(hungry(left, right) orElse (managementEvents))
|
||||
left ! Take(self)
|
||||
right ! Take(self)
|
||||
case Identify => identify("Thinking")
|
||||
|
|
@ -84,11 +90,14 @@ class Hakker(name: String, chair: Int) extends Actor {
|
|||
//it starts to wait for the response of the other grab
|
||||
def hungry(left: ActorRef, right: ActorRef): Receive = {
|
||||
case Taken(`left`) =>
|
||||
become(waiting_for(left, right, false) orElse (clusterEvents))
|
||||
pubStateChange("hungry", "waiting")
|
||||
become(waiting_for(left, right, false) orElse (managementEvents))
|
||||
case Taken(`right`) =>
|
||||
become(waiting_for(left, right, true) orElse (clusterEvents))
|
||||
pubStateChange("hungry", "waiting")
|
||||
become(waiting_for(left, right, true) orElse (managementEvents))
|
||||
case Busy(chopstick) =>
|
||||
become(denied_a_chopstick(left, right) orElse (clusterEvents))
|
||||
pubStateChange("hungry", "denied_a_chopstick")
|
||||
become(denied_a_chopstick(left, right) orElse (managementEvents))
|
||||
case Identify => identify("Hungry")
|
||||
}
|
||||
|
||||
|
|
@ -98,14 +107,17 @@ class Hakker(name: String, chair: Int) extends Actor {
|
|||
def waiting_for(left: ActorRef, right: ActorRef, waitingForLeft: Boolean): Receive = {
|
||||
case Taken(`left`) if waitingForLeft =>
|
||||
log.info("%s has picked up %s and %s and starts to eat".format(name, left.path.name, right.path.name))
|
||||
become(eating(left, right) orElse (clusterEvents))
|
||||
pubStateChange("waiting", "eating")
|
||||
become(eating(left, right) orElse (managementEvents))
|
||||
system.scheduler.scheduleOnce(5 seconds, self, Think)
|
||||
case Taken(`right`) if !waitingForLeft =>
|
||||
log.info("%s has picked up %s and %s and starts to eat".format(name, left.path.name, right.path.name))
|
||||
become(eating(left, right) orElse (clusterEvents))
|
||||
pubStateChange("waiting", "eating")
|
||||
become(eating(left, right) orElse (managementEvents))
|
||||
system.scheduler.scheduleOnce(5 seconds, self, Think)
|
||||
case Busy(chopstick) =>
|
||||
become(thinking(left, right) orElse (clusterEvents))
|
||||
pubStateChange("waiting", "thinking")
|
||||
become(thinking(left, right) orElse (managementEvents))
|
||||
if (waitingForLeft) {
|
||||
right ! Put(self)
|
||||
} else {
|
||||
|
|
@ -120,11 +132,13 @@ class Hakker(name: String, chair: Int) extends Actor {
|
|||
//Then go back and think and try to grab the chopsticks again
|
||||
def denied_a_chopstick(left: ActorRef, right: ActorRef): Receive = {
|
||||
case Taken(chopstick) =>
|
||||
become(thinking(left, right) orElse (clusterEvents))
|
||||
pubStateChange("denied_a_chopstick", "thinking")
|
||||
become(thinking(left, right) orElse (managementEvents))
|
||||
chopstick ! Put(self)
|
||||
self ! Eat
|
||||
case Busy(chopstick) =>
|
||||
become(thinking(left, right) orElse (clusterEvents))
|
||||
pubStateChange("denied_a_chopstick", "thinking")
|
||||
become(thinking(left, right) orElse (managementEvents))
|
||||
self ! Eat
|
||||
case Identify => identify("Denied a Chopstick")
|
||||
}
|
||||
|
|
@ -133,7 +147,8 @@ class Hakker(name: String, chair: Int) extends Actor {
|
|||
//then he puts down his chopsticks and starts to think
|
||||
def eating(left: ActorRef, right: ActorRef): Receive = {
|
||||
case Think =>
|
||||
become(thinking(left, right) orElse (clusterEvents))
|
||||
pubStateChange("eating", "thinking")
|
||||
become(thinking(left, right) orElse (managementEvents))
|
||||
left ! Put(self)
|
||||
right ! Put(self)
|
||||
log.info("%s puts down his chopsticks and starts to think".format(name))
|
||||
|
|
@ -143,25 +158,42 @@ class Hakker(name: String, chair: Int) extends Actor {
|
|||
|
||||
def waitForChopsticks: Receive = {
|
||||
case (left: ActorRef, right: ActorRef) =>
|
||||
become(thinking(left, right) orElse (clusterEvents))
|
||||
pubStateChange("waiting", "thinking")
|
||||
become(thinking(left, right) orElse managementEvents)
|
||||
system.scheduler.scheduleOnce(5 seconds, self, Eat)
|
||||
case Identify => identify("Waiting")
|
||||
}
|
||||
|
||||
def clusterEvents: Receive = {
|
||||
def managementEvents: Receive = {
|
||||
case state: CurrentClusterState => state.leader foreach updateTable
|
||||
case LeaderChanged(Some(leaderAddress)) => updateTable(leaderAddress)
|
||||
case SubscribeToHakkerStateChanges =>
|
||||
subscribers += sender
|
||||
context watch sender
|
||||
case Terminated(subscriber) =>
|
||||
subscribers -= subscriber
|
||||
}
|
||||
|
||||
def identify(busyWith: String) {
|
||||
def initializing: Receive = {
|
||||
case Identify => identify("Initializing")
|
||||
}
|
||||
|
||||
def identify(busyWith: String): Unit = {
|
||||
sender() ! Identification(name, busyWith)
|
||||
}
|
||||
|
||||
def updateTable(leaderAdress: Address) {
|
||||
become(waitForChopsticks)
|
||||
context.actorFor(RootActorPath(leaderAdress) / "user" / "table") ! chair
|
||||
def updateTable(leaderAdress: Address): Unit = {
|
||||
pubStateChange("-", "waiting")
|
||||
become(waitForChopsticks orElse managementEvents)
|
||||
context.actorSelection(RootActorPath(leaderAdress) / "user" / "table") ! chair
|
||||
}
|
||||
|
||||
//All hakkers start in a non-eating state
|
||||
def receive = clusterEvents
|
||||
def receive = initializing orElse managementEvents
|
||||
|
||||
def pubStateChange(from: String, to: String): Unit = {
|
||||
val chg = HakkerStateChange(name, from, to)
|
||||
subscribers foreach { _ ! chg }
|
||||
}
|
||||
|
||||
}
|
||||
|
|
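The hunk above threads ``managementEvents`` through every ``become`` call so that subscription and cluster messages are handled in every state. A minimal, self-contained illustration of that ``Receive`` composition pattern (the messages here are hypothetical, not part of the sample)::

    import akka.actor.Actor

    class TrafficLight extends Actor {
      import context.become

      case object Advance // hypothetical state-changing message
      case object Ping    // hypothetical "management" message handled in every state

      // Behaviour that should stay active regardless of the current state
      def common: Receive = { case Ping => sender() ! "alive" }

      def red: Receive = { case Advance => become(green orElse common) }
      def green: Receive = { case Advance => become(red orElse common) }

      // Compose the initial state with the always-on behaviour, as the Hakker does
      def receive = red orElse common
    }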
|
|||
|
|
@ -0,0 +1,58 @@
|
|||
package akka.sample.osgi.internal
|
||||
|
||||
import akka.persistence.EventsourcedProcessor
|
||||
import akka.actor.ActorRef
|
||||
import akka.sample.osgi.api.HakkerStateChange
|
||||
import akka.sample.osgi.api.SubscribeToHakkerStateChanges
|
||||
import akka.sample.osgi.api.EatingCount
|
||||
import akka.sample.osgi.api.GetEatingCount
|
||||
import akka.sample.osgi.api.TrackHakker
|
||||
|
||||
object HakkerTracker {
|
||||
sealed trait DomainEvent
|
||||
case class StartedEating(name: String) extends DomainEvent
|
||||
case class StoppedEating(name: String) extends DomainEvent
|
||||
|
||||
object State {
|
||||
val empty: State = new State(Map.empty)
|
||||
}
|
||||
case class State private (eatingCounts: Map[String, Int]) {
|
||||
def updated(event: DomainEvent): State = event match {
|
||||
case StartedEating(name) =>
|
||||
val c = eatingCounts.getOrElse(name, 0) + 1
|
||||
copy(eatingCounts = eatingCounts + (name -> c))
|
||||
case StoppedEating(name) =>
|
||||
this
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
class HakkerTracker extends EventsourcedProcessor {
|
||||
import HakkerTracker._
|
||||
|
||||
var state = State.empty
|
||||
|
||||
override def receiveRecover: Receive = {
|
||||
case evt: DomainEvent =>
|
||||
state = state.updated(evt)
|
||||
}
|
||||
|
||||
override def receiveCommand: Receive = {
|
||||
case TrackHakker(hakker) =>
|
||||
hakker ! SubscribeToHakkerStateChanges
|
||||
|
||||
case HakkerStateChange(name, _, "eating") =>
|
||||
persist(StartedEating(name)) { evt =>
|
||||
state = state.updated(evt)
|
||||
}
|
||||
|
||||
case HakkerStateChange(name, "eating", _) =>
|
||||
persist(StoppedEating(name)) { evt =>
|
||||
state = state.updated(evt)
|
||||
}
|
||||
|
||||
case GetEatingCount(name) =>
|
||||
sender ! EatingCount(name, 17)
|
||||
}
|
||||
|
||||
}
|
||||
|
|
@@ -0,0 +1,25 @@
+package akka.sample.osgi.serialization
+
+import akka.serialization.Serializer
+import akka.actor.ExtendedActorSystem
+import akka.serialization.Serialization
+import akka.serialization.SerializationExtension
+
+class DiningHakkerSerializer(val system: ExtendedActorSystem) extends Serializer {
+
+  override def includeManifest: Boolean = true
+
+  override def identifier = 98765
+
+  lazy val javaSerializer = SerializationExtension(system).findSerializerFor(classOf[java.io.Serializable])
+
+  def toBinary(obj: AnyRef): Array[Byte] = {
+    javaSerializer.toBinary(obj)
+  }
+
+  def fromBinary(bytes: Array[Byte],
+                 clazz: Option[Class[_]]): AnyRef = {
+    javaSerializer.fromBinary(bytes, clazz)
+  }
+
+}
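A direct round-trip through the serializer added above (a sketch; normally Akka instantiates the serializer via the ``serializers``/``serialization-bindings`` configuration shown earlier in this diff)::

    import akka.actor.{ ActorSystem, ExtendedActorSystem }
    import akka.sample.osgi.api.Identification
    import akka.sample.osgi.serialization.DiningHakkerSerializer

    object SerializerRoundTrip extends App {
      // The running ActorSystem implementation is an ExtendedActorSystem, which
      // the serializer's constructor expects.
      val system = ActorSystem("roundTrip").asInstanceOf[ExtendedActorSystem]
      val serializer = new DiningHakkerSerializer(system)

      val original = Identification("TestHakker", "thinking")
      val bytes = serializer.toBinary(original)
      val restored = serializer.fromBinary(bytes, Some(classOf[Identification]))
      assert(restored == original)
      system.shutdown()
    }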
@ -17,10 +17,14 @@ package akka.sample.osgi.service
|
|||
|
||||
import akka.sample.osgi.api.DiningHakkersService
|
||||
import akka.actor.{ Props, ActorSystem }
|
||||
import akka.actor.ActorRef
|
||||
import akka.sample.osgi.internal.Hakker
|
||||
import akka.sample.osgi.internal.HakkerTracker
|
||||
|
||||
class DiningHakkersServiceImpl(system: ActorSystem) extends DiningHakkersService {
|
||||
def getHakker(name: String, chairNumber: Int) = {
|
||||
def getHakker(name: String, chairNumber: Int): ActorRef =
|
||||
system.actorOf(Props(classOf[Hakker], name, chairNumber))
|
||||
}
|
||||
|
||||
def getTracker(): ActorRef =
|
||||
system.actorOf(Props[HakkerTracker], "tracker")
|
||||
}
|
||||
|
|
|
|||
|
|
@ -5,7 +5,7 @@
|
|||
<parent>
|
||||
<groupId>com.typesafe.akka.akka-sample-osgi-dining-hakkers</groupId>
|
||||
<artifactId>project</artifactId>
|
||||
<version>2.2.0-SNAPSHOT</version>
|
||||
<version>2.3-SNAPSHOT</version>
|
||||
</parent>
|
||||
<modelVersion>4.0.0</modelVersion>
|
||||
|
||||
|
|
@ -34,92 +34,10 @@
|
|||
<groupId>org.scalatest</groupId>
|
||||
<artifactId>scalatest_${scala.dep.version}</artifactId>
|
||||
</dependency>
|
||||
<!--
|
||||
This is needed only for the scalatest osgi bundle, it has a non-optional import on scala-actors stuff.
|
||||
This can be removed once we are using ScalaTest 2.0.x: https://groups.google.com/d/topic/scalatest-users/XAVWBJ-v6vA/discussion
|
||||
See also TestOptions.testBundles
|
||||
-->
|
||||
<dependency>
|
||||
<groupId>org.scala-lang</groupId>
|
||||
<artifactId>scala-actors</artifactId>
|
||||
<version>${scala.version}</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>junit</groupId>
|
||||
<artifactId>junit</artifactId>
|
||||
</dependency>
|
||||
|
||||
<!--
|
||||
Explicitly import Pax Exam 2.6.0, should be able to remove this once Karaf 2.3.1 is released
|
||||
(http://www.mail-archive.com/user@karaf.apache.org/msg04061.html)
|
||||
-->
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.exam</groupId>
|
||||
<artifactId>pax-exam</artifactId>
|
||||
<version>${paxexam.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.exam</groupId>
|
||||
<artifactId>pax-exam-container-remote</artifactId>
|
||||
<version>${paxexam.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.exam</groupId>
|
||||
<artifactId>pax-exam-invoker-junit</artifactId>
|
||||
<version>${paxexam.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.exam</groupId>
|
||||
<artifactId>pax-exam-container-rbc</artifactId>
|
||||
<version>${paxexam.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.exam</groupId>
|
||||
<artifactId>pax-exam-container-rbc-client</artifactId>
|
||||
<version>${paxexam.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.exam</groupId>
|
||||
<artifactId>pax-exam-extender-service</artifactId>
|
||||
<version>${paxexam.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.exam</groupId>
|
||||
<artifactId>pax-exam-inject</artifactId>
|
||||
<version>${paxexam.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.swissbox</groupId>
|
||||
<artifactId>pax-swissbox-core</artifactId>
|
||||
<version>${paxswissbox.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.swissbox</groupId>
|
||||
<artifactId>pax-swissbox-extender</artifactId>
|
||||
<version>${paxswissbox.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.swissbox</groupId>
|
||||
<artifactId>pax-swissbox-lifecycle</artifactId>
|
||||
<version>${paxswissbox.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.ops4j.pax.swissbox</groupId>
|
||||
<artifactId>pax-swissbox-framework</artifactId>
|
||||
<version>${paxswissbox.version}</version>
|
||||
<scope>test</scope>
|
||||
</dependency>
|
||||
<!-- End of temporary Pax Exam dependency imports -->
|
||||
</dependencies>
|
||||
|
||||
<build>
|
||||
|
|
@ -147,7 +65,6 @@
|
|||
<pax.exam.service.timeout>30000</pax.exam.service.timeout>
|
||||
<karaf.version>${karaf.version}</karaf.version>
|
||||
<project.version>${project.version}</project.version>
|
||||
<!--<scalatest.artifact>scalatest_${scala.dep.version}</scalatest.artifact>-->
|
||||
<scala.dep.version>${scala.dep.version}</scala.dep.version>
|
||||
</systemPropertyVariables>
|
||||
</configuration>
|
||||
|
|
|
|||
|
|
@ -1,19 +1,19 @@
|
|||
package akka.sample.osgi.test

import akka.actor._
import akka.sample.osgi.api.{DiningHakkersService, Identify, Identification}
import akka.actor.{ Identify => _, _ }
import akka.sample.osgi.api._
import akka.sample.osgi.test.TestOptions._
import org.junit.runner.RunWith
import org.junit.{Before, Test}
import org.ops4j.pax.exam.{Option => PaxOption}
import org.ops4j.pax.exam.junit.{Configuration, JUnit4TestRunner}
import org.ops4j.pax.exam.util.Filter
import org.scalatest.junit.JUnitSuite
import org.scalatest.junit.ShouldMatchersForJUnit
import javax.inject.Inject
import java.util.concurrent.{TimeUnit, SynchronousQueue}
import akka.testkit.TestProbe
import scala.Some
import javax.inject.Inject
import org.junit.runner.RunWith
import org.junit.{ Before, Test }
import org.ops4j.pax.exam.junit.{ Configuration, JUnit4TestRunner }
import org.ops4j.pax.exam.util.Filter
import org.ops4j.pax.exam.{ Option => PaxOption }
import org.scalatest.junit.{AssertionsForJUnit, JUnitSuite}
import org.scalatest.Matchers
import scala.concurrent.duration._
import org.apache.karaf.tooling.exam.options.LogLevelOption

/**
 * This is a ScalaTest based integration test. Pax-Exam, which is responsible for loading the test class into
@@ -30,7 +30,7 @@ import scala.Some
 * TODO attempt to use the Akka test probe
 */
@RunWith(classOf[JUnit4TestRunner])
class HakkerStatusTest extends JUnitSuite with MatchersForJUnit {
class HakkerStatusTest extends JUnitSuite with Matchers with AssertionsForJUnit {

@Inject @Filter(timeout = 30000)
var actorSystem: ActorSystem = _
@@ -43,66 +43,52 @@ class HakkerStatusTest extends JUnitSuite with MatchersForJUnit {
@Configuration
def config: Array[PaxOption] = Array[PaxOption](
karafOptionsWithTestBundles(),
featureDiningHakkers()
//,debugOptions()
)
featureDiningHakkers() //, debugOptions(level = LogLevelOption.LogLevel.DEBUG)
)

// Junit @Before and @After can be used as well

/* @Before
@Before
def setupAkkaTestkit() {
testProbe = new TestProbe(actorSystem)
}*/

}

@Test
def verifyObtainingAHakkerViaTheTheDiningHakkersService() {

val name = "TestHakker"
val hakker = Some(service.getHakker(name, (math.floor(math.random * 5)).toInt))
val hakker = Option(service.getHakker(name, 2))
.getOrElse(throw new IllegalStateException("No Hakker was created via DiningHakkerService"))

hakker should not be (null)
// takes some time for the first message to get through
testProbe.within(10.seconds) {
testProbe.send(hakker, Identify)
val Identification(fromHakker, busyWith) = testProbe.expectMsgType[Identification]

/* TODO Getting some weird config error with TestProbe, is it a TestProbe inside OSGi issue?
Exception in thread "RMI TCP Connection(idle)" java.lang.NullPointerException
at com.typesafe.config.impl.SerializedConfigValue.writeOrigin(SerializedConfigValue.java:202)
at com.typesafe.config.impl.ConfigImplUtil.writeOrigin(ConfigImplUtil.java:224)
at com.typesafe.config.ConfigException.writeObject(ConfigException.java:58)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

java.io.EOFException
at java.io.ObjectInputStream$BlockDataInputStream.readUnsignedByte(ObjectInputStream.java:2747)
at java.io.ObjectInputStream.readUnsignedByte(ObjectInputStream.java:924)
at com.typesafe.config.impl.SerializedConfigValue.readCode(SerializedConfigValue.java:412)
at com.typesafe.config.impl.SerializedConfigValue.readOrigin(SerializedConfigValue.java:218)
...
testProbe.send(hakker, Identify)
val identification = testProbe.expectMsgClass(classOf[Identification])
val (fromHakker, busyWith) = (identification.name, identification.busyWith)
*/

// create an "Interrogator" actor that receives a Hakker's identification
// this is to work around the TestProbe Exception above
val response = new SynchronousQueue[(String, String)]()

hakker.tell(Identify, actorSystem.actorOf(Props(classOf[HakkerStatusTest.Interrogator], response), "Interrogator"))
val (fromHakker, busyWith) = response.poll(5, TimeUnit.SECONDS)

println("---------------> %s is busy with %s.".format(fromHakker, busyWith))
fromHakker should be ("TestHakker")
busyWith should not be (null)
println("---------------> %s is busy with %s.".format(fromHakker, busyWith))
fromHakker should be("TestHakker")
busyWith should not be (null)
}

}

}
@Test
def verifyHakkerTracker() {

object HakkerStatusTest {
class Interrogator(queue: SynchronousQueue[(String, String)]) extends Actor {
def receive = {
case msg: Identification => {
queue.put((msg.name, msg.busyWith))
val name = "TestHakker"
val hakker = service.getHakker(name, 3)
val tracker = service.getTracker()
tracker ! TrackHakker(hakker)
testProbe.within(10.seconds) {
testProbe.awaitAssert {
testProbe.within(1.second) {
tracker.tell(GetEatingCount(name), testProbe.ref)
val reply = testProbe.expectMsgType[EatingCount]
reply.hakkerName should be(name)
reply.count should be > (0)
}
}
}
}

}
@@ -2,7 +2,7 @@ package akka.sample.osgi.test

import org.ops4j.pax.exam.CoreOptions._
import org.ops4j.pax.exam.options.DefaultCompositeOption
import org.ops4j.pax.exam.{Option => PaxOption}
import org.ops4j.pax.exam.{ Option => PaxOption }
import org.apache.karaf.tooling.exam.options.LogLevelOption
import org.apache.karaf.tooling.exam.options.KarafDistributionOption._
import java.io.File
@@ -20,34 +20,26 @@ object TestOptions {
.karafVersion(System.getProperty("karaf.version")).name("Apache Karaf").useDeployFolder(useDeployFolder)

new DefaultCompositeOption(if (extractInTargetFolder) kdc.unpackDirectory(new File("target/paxexam/unpack/")) else kdc,
editConfigurationFilePut("etc/config.properties", "karaf.framework", "equinox")
)
editConfigurationFilePut("etc/config.properties", "karaf.framework", "equinox"))
}

def testBundles(): PaxOption = {
new DefaultCompositeOption(
mavenBundle("com.typesafe.akka", "akka-testkit_%s".format(scalaDepVersion)).versionAsInProject,
mavenBundle("org.scalatest", "scalatest_%s".format(scalaDepVersion)).versionAsInProject,
// This is needed for the scalatest osgi bundle, it has a non-optional import on a scala-actors package
mavenBundle("org.scala-lang", "scala-actors").versionAsInProject,
junitBundles
)
junitBundles)
}

def debugOptions(level: LogLevelOption.LogLevel = LogLevelOption.LogLevel.INFO, debugPort: Int= 5005): PaxOption = {
new DefaultCompositeOption(
logLevel(level),
debugConfiguration(String.valueOf(debugPort), true),
configureConsole().startLocalConsole(),
configureConsole().startRemoteShell()
)
def debugOptions(level: LogLevelOption.LogLevel = LogLevelOption.LogLevel.INFO, debugPort: Option[Int] = None): PaxOption = {
val options: List[PaxOption] = List(logLevel(level), configureConsole().startLocalConsole(), configureConsole().startRemoteShell()) ++
debugPort.toList.map(p => debugConfiguration(String.valueOf(p), true))
new DefaultCompositeOption(options: _*)
}

def karafOptionsWithTestBundles(useDeployFolder: Boolean = false, extractInTargetFolder: Boolean = true): PaxOption = {
new DefaultCompositeOption(
karafOptions(useDeployFolder, extractInTargetFolder),
testBundles()
)
testBundles())
}

def featureDiningHakkers(): PaxOption = {

@@ -56,7 +48,7 @@ object TestOptions {

def akkaFeature(feature: String): PaxOption = {
scanFeatures(maven.groupId("com.typesafe.akka.akka-sample-osgi-dining-hakkers")
.artifactId("akka-sample-osgi-dining-hakkers") .`type`("xml").classifier("features")
.artifactId("akka-sample-osgi-dining-hakkers").`type`("xml").classifier("features")
.version(System.getProperty("project.version")), feature)
}
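Editor's note: as a hedged illustration only (this sketch is not part of the commit), the reworked debugOptions can be combined with the other helpers in a Pax Exam configuration. The test class name below is a placeholder, and the debug port is only passed when remote debugging is wanted; with debugPort = None the debugConfiguration option is simply omitted.

    // Hypothetical usage sketch, not part of this commit.
    import org.ops4j.pax.exam.{ Option => PaxOption }
    import org.ops4j.pax.exam.junit.Configuration
    import org.apache.karaf.tooling.exam.options.LogLevelOption
    import akka.sample.osgi.test.TestOptions._

    class SomeDiningHakkersTest { // placeholder name
      @Configuration
      def config: Array[PaxOption] = Array[PaxOption](
        karafOptionsWithTestBundles(),
        featureDiningHakkers(),
        // enable remote debugging on port 5005; use debugPort = None to skip it
        debugOptions(level = LogLevelOption.LogLevel.DEBUG, debugPort = Some(5005)))
    }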
@@ -1,16 +1,18 @@
#!/bin/bash

projdir=$(cd $(dirname $0); pwd)
version=2.2.0-SNAPSHOT
version=2.3-SNAPSHOT

# This directory is specified in the build as the root of the tar
# Use tar --strip-components=1 to ignore the root
outputdir="$projdir/akka-osgi-sample-$version"
outputdir="$projdir/target/akka-sample-osgi-dining-hakkers-$version"

mkdir $projdir/target

if [[ -d "$outputdir" ]]; then
echo Deleting existing $outputdir...
rm -fr "$outputdir"
fi
echo Extracting configured container into $outputdir...
tar -C $projdir -zxf assembly-dist/target/assembly-dist-$version.tar.gz
tar -C $projdir/target -zxf assembly-dist/target/akka-sample-osgi-dining-hakkers-dist-$version.tar.gz
echo Extract complete, please run $outputdir/bin/karaf

@@ -1,2 +0,0 @@
#!/bin/bash
./apache-karaf-2.3.0/bin/karaf

@@ -5,22 +5,25 @@
<modelVersion>4.0.0</modelVersion>
<groupId>com.typesafe.akka.akka-sample-osgi-dining-hakkers</groupId>
<artifactId>project</artifactId>
<version>2.2.0-SNAPSHOT</version>
<version>2.3-SNAPSHOT</version>

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<akka.version>2.3-SNAPSHOT</akka.version>
<karaf.version>2.3.0</karaf.version>
<karaf.version>2.3.1</karaf.version>
<karaf.tooling.exam.version>${karaf.version}</karaf.tooling.exam.version>
<netty.version>3.6.2.Final</netty.version>
<osgi.version>4.2.0</osgi.version>
<netty.version>3.8.0.Final</netty.version>
<osgi.version>4.3.1</osgi.version>
<paxexam.version>2.6.0</paxexam.version>
<paxswissbox.version>1.6.0</paxswissbox.version>
<protobuf.version>2.5.0</protobuf.version>
<scala.version>2.10.2</scala.version>
<scala.dep.version>2.10</scala.dep.version>
<scalatest.version>1.9.1</scalatest.version>
<typesafe.config.version>1.0.2</typesafe.config.version>
<scalatest.version>2.0</scalatest.version>
<typesafe.config.version>1.2.0</typesafe.config.version>
<leveldb.version>0.5</leveldb.version>
<leveldbjni.version>1.7</leveldbjni.version>
<uncommons-maths.version>1.2.2</uncommons-maths.version>
</properties>

<modules>

@@ -86,6 +89,16 @@
<artifactId>akka-remote_${scala.dep.version}</artifactId>
<version>${akka.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-slf4j_${scala.dep.version}</artifactId>
<version>${akka.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-persistence-experimental_${scala.dep.version}</artifactId>
<version>${akka.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>

@@ -101,6 +114,16 @@
<artifactId>netty</artifactId>
<version>${netty.version}</version>
</dependency>
<dependency>
<groupId>org.iq80.leveldb</groupId>
<artifactId>leveldb</artifactId>
<version>${leveldb.version}</version>
</dependency>
<dependency>
<groupId>org.fusesource.leveldbjni</groupId>
<artifactId>leveldbjni-all</artifactId>
<version>${leveldbjni.version}</version>
</dependency>

<dependency>
<groupId>org.apache.karaf.tooling.exam</groupId>

@@ -129,7 +152,7 @@
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.9</version>
<version>4.10</version>
<scope>test</scope>
</dependency>
</dependencies>

@@ -161,10 +184,6 @@
<id>oss-sonatype-releases</id>
<url>https://oss.sonatype.org/content/repositories/releases</url>
</repository>
<repository>
<id>typesafe-snapshots</id>
<url>http://repo.typesafe.com/typesafe/snapshots/</url>
</repository>
</repositories>

<build>

@@ -5,7 +5,7 @@
<parent>
<groupId>com.typesafe.akka.akka-sample-osgi-dining-hakkers</groupId>
<artifactId>project</artifactId>
<version>2.2.0-SNAPSHOT</version>
<version>2.3-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
@@ -57,7 +57,7 @@ object AkkaBuild extends Build {
javacOptions in JavaDoc := Seq(),
artifactName in packageDoc in JavaDoc := ((sv, mod, art) => "" + mod.name + "_" + sv.binary + "-" + mod.revision + "-javadoc.jar"),
packageDoc in Compile <<= packageDoc in JavaDoc,
Dist.distExclude := Seq(actorTests.id, docs.id, samples.id, osgi.id, osgiAries.id),
Dist.distExclude := Seq(actorTests.id, docs.id, samples.id, osgi.id),
// generate online version of docs
sphinxInputs in Sphinx <<= sphinxInputs in Sphinx in LocalProject(docs.id) map { inputs => inputs.copy(tags = inputs.tags :+ "online") },
// don't regenerate the pdf, just reuse the akka-docs version

@@ -75,7 +75,7 @@ object AkkaBuild extends Build {
validatePullRequest <<= (Unidoc.unidoc, SphinxSupport.generate in Sphinx in docs) map { (_, _) => }
),
aggregate = Seq(actor, testkit, actorTests, dataflow, remote, remoteTests, camel, cluster, slf4j, agent, transactor,
persistence, mailboxes, zeroMQ, kernel, osgi, osgiAries, docs, contrib, samples, multiNodeTestkit)
persistence, mailboxes, zeroMQ, kernel, osgi, docs, contrib, samples, multiNodeTestkit)
)

lazy val akkaScalaNightly = Project(

@@ -83,7 +83,7 @@ object AkkaBuild extends Build {
base = file("akka-scala-nightly"),
// remove dependencies that we have to build ourselves (Scala STM, ZeroMQ Scala Bindings)
aggregate = Seq(actor, testkit, actorTests, dataflow, remote, remoteTests, camel, cluster, slf4j,
persistence, mailboxes, kernel, osgi, osgiAries, contrib, samples, multiNodeTestkit)
persistence, mailboxes, kernel, osgi, contrib, samples, multiNodeTestkit)
)

// this detached pseudo-project is used for running the tests against a different Scala version than the one used for compilation

@@ -145,7 +145,7 @@ object AkkaBuild extends Build {
lazy val actor = Project(
id = "akka-actor",
base = file("akka-actor"),
settings = defaultSettings ++ formatSettings ++ scaladocSettings ++ javadocSettings ++ Seq(
settings = defaultSettings ++ formatSettings ++ scaladocSettings ++ javadocSettings ++ OSGi.actor ++ Seq(
// to fix scaladoc generation
fullClasspath in doc in Compile <<= fullClasspath in Compile,
libraryDependencies ++= Dependencies.actor,

@@ -352,77 +352,12 @@ object AkkaBuild extends Build {
)
)

val ActorMakeOsgiConfiguration = TaskKey[Seq[File]]("actor-make-osgi-configuration", "Copy reference.conf from akka modules for akka-osgi")
val ActorOsgiConfigurationReference = TaskKey[Seq[(File, String)]]("actor-osgi-configuration-reference", "The list of all configuration files to be bundled in an osgi bundle, as well as project name.")

import Project.Initialize
/** This method uses a bit of advanced sbt initailizers to grab the normalized names and resource directories
 * from a set of projects, and then use this to create a mapping of (reference.conf to project name).
 */
def ActorOsgiConfigurationReferenceAction(projects: Seq[Project]): Initialize[Task[Seq[(File, String)]]] = {
val directories: Initialize[Seq[File]] = projects.map(resourceDirectory in Compile in _).join
val names: Initialize[Seq[String]] = projects.map(normalizedName in _).join
directories zip names map { case (dirs, ns) =>
for {
(dir, project) <- dirs zip ns
val conf = dir / "reference.conf"
if conf.exists
} yield conf -> project
}
}

/** This method is repsonsible for genreating a new typeasafe config reference.conf file for OSGi.
 * it copies all the files in the `includes` parameter, using the associated project name. Then
 * it generates a new resource.conf file which includes these files.
 *
 * @param target The location where we write the new files
 * @param includes A sequnece of (<reference.conf>, <project name>) pairs.
 */
def makeOsgiConfigurationFiles(includes: Seq[(File, String)], target: File, streams: TaskStreams): Seq[File] = {
// First we copy all the files to their destination
val toCopy =
for {
(file, project) <- includes
val toFile = target / (project + ".conf")
} yield file -> toFile
IO.copy(toCopy)
val copiedResourceFileLocations = toCopy.map(_._2)
streams.log.debug("Copied OSGi resources: " + copiedResourceFileLocations.mkString("\n\t", "\n\t", "\n"))
// Now we generate the new including conf file
val newConf = target / "resource.conf"
val confIncludes =
for {
(file, project) <- includes
} yield "include \""+ project +".conf\""
val writer = new PrintWriter(newConf)
try writer.write(confIncludes mkString "\n")
finally writer.close()
streams.log.info("Copied OSGi resources.")
newConf +: copiedResourceFileLocations
}

lazy val osgi = Project(
id = "akka-osgi",
base = file("akka-osgi"),
dependencies = Seq(actor),
settings = defaultSettings ++ formatSettings ++ scaladocSettings ++ javadocSettings ++ OSGi.osgi ++ Seq(
libraryDependencies ++= Dependencies.osgi,
cleanFiles <+= baseDirectory { base => base / "src/main/resources" } ,
ActorOsgiConfigurationReference <<= ActorOsgiConfigurationReferenceAction(projects.filter(p => !p.id.contains("test") && !p.id.contains("sample"))),
ActorMakeOsgiConfiguration <<= (ActorOsgiConfigurationReference, resourceManaged in Compile, streams) map makeOsgiConfigurationFiles,
resourceGenerators in Compile <+= ActorMakeOsgiConfiguration,
parallelExecution in Test := false,
reportBinaryIssues := () // disable bin comp check
)
)

lazy val osgiAries = Project(
id = "akka-osgi-aries",
base = file("akka-osgi-aries"),
dependencies = Seq(osgi % "compile;test->test"),
settings = defaultSettings ++ formatSettings ++ scaladocSettings ++ javadocSettings ++ OSGi.osgiAries ++ Seq(
libraryDependencies ++= Dependencies.osgiAries,
parallelExecution in Test := false,
reportBinaryIssues := () // disable bin comp check
)
@@ -553,61 +488,73 @@ object AkkaBuild extends Build {

lazy val osgiDiningHakkersSample = Project(id = "akka-sample-osgi-dining-hakkers",
base = file("akka-samples/akka-sample-osgi-dining-hakkers"),
settings = parentSettings
settings = parentSettings ++ osgiSampleSettings
) aggregate(osgiDiningHakkersSampleApi, osgiDiningHakkersSampleCommand, osgiDiningHakkersSampleCore,
osgiDiningHakkersSampleIntegrationTest, uncommons)

lazy val osgiDiningHakkersSampleApi = Project(id = "akka-sample-osgi-dining-hakkers-api",
base = file("akka-samples/akka-sample-osgi-dining-hakkers/api"),
settings = sampleSettings ++ OSGi.osgiDiningHakkersSampleApi
settings = sampleSettings ++ osgiSampleSettings ++ OSGi.osgiDiningHakkersSampleApi
)dependsOn(actor)

lazy val osgiDiningHakkersSampleCommand = Project(id = "akka-sample-osgi-dining-hakkers-command",
base = file("akka-samples/akka-sample-osgi-dining-hakkers/command"),
settings = sampleSettings ++ OSGi.osgiDiningHakkersSampleCommand ++ Seq(
libraryDependencies ++= Dependencies.osgiDiningHakkerSampleCommand
settings = sampleSettings ++ osgiSampleSettings ++ OSGi.osgiDiningHakkersSampleCommand ++ Seq(
libraryDependencies ++= Dependencies.osgiDiningHakkersSampleCommand
)
) dependsOn (osgiDiningHakkersSampleApi, actor)

lazy val osgiDiningHakkersSampleCore = Project(id = "akka-sample-osgi-dining-hakkers-core",
base = file("akka-samples/akka-sample-osgi-dining-hakkers/core"),
settings = sampleSettings ++ OSGi.osgiDiningHakkersSampleCore ++ Seq(
libraryDependencies ++= Dependencies.osgiDiningHakkerSampleCore
settings = sampleSettings ++ osgiSampleSettings ++ OSGi.osgiDiningHakkersSampleCore ++ Seq(
libraryDependencies ++= Dependencies.osgiDiningHakkersSampleCore
)
) dependsOn (osgiDiningHakkersSampleApi, actor, remote, cluster, osgi)
) dependsOn (osgiDiningHakkersSampleApi, actor, remote, cluster, persistence, osgi)

lazy val osgiDiningHakkersSampleTest = Project(id = "akka-sample-osgi-dining-hakkers-test",
base = file("akka-samples/akka-sample-osgi-dining-hakkers/integration-test"),
settings = sampleSettings ++ osgiSampleSettings ++ OSGi.osgiDiningHakkersSampleCore ++ Seq(
libraryDependencies ++= Dependencies.osgiDiningHakkersSampleTest
)
) dependsOn (osgiDiningHakkersSampleCommand, osgiDiningHakkersSampleCore, testkit )

//TODO to remove it as soon as the uncommons gets OSGified, see ticket #2990
lazy val uncommons = Project(id = "akka-sample-osgi-dining-hakkers-uncommons",
base = file("akka-samples/akka-sample-osgi-dining-hakkers/uncommons"),
settings = sampleSettings ++ OSGi.osgiDiningHakkersSampleUncommons ++ Seq(
settings = sampleSettings ++ osgiSampleSettings ++ OSGi.osgiDiningHakkersSampleUncommons ++ Seq(
libraryDependencies ++= Dependencies.uncommons,
version := "1.2.0"
)
)

def executeMvnCommands(failureMessage: String, commands: String*) = {
if ({List("sh", "-c", commands.mkString("cd akka-samples/akka-sample-osgi-dining-hakkers; mvn -U ", " ", "")) !} != 0)
if ({List("sh", "-c", commands.mkString("cd akka-samples/akka-sample-osgi-dining-hakkers; mvn ", " ", "")) !} != 0)
throw new Exception(failureMessage)
}

lazy val osgiDiningHakkersSampleIntegrationTest = Project(id = "akka-sample-osgi-dining-hakkers-integration",
base = file("akka-samples/akka-sample-osgi-dining-hakkers-integration"),
settings = sampleSettings ++ (
if (System.getProperty("akka.osgi.sample.test", "false").toBoolean) Seq(
settings = sampleSettings ++ osgiSampleSettings ++ (
if (System.getProperty("akka.osgi.sample.test", "true").toBoolean) Seq(
test in Test ~= { x => {
executeMvnCommands("Osgi sample Dining hakkers test failed", "clean", "install")
}})
}},
// force publication of artifacts to local maven repo
compile in Compile <<=
(publishM2 in actor, publishM2 in testkit, publishM2 in remote, publishM2 in cluster, publishM2 in osgi,
publishM2 in slf4j, publishM2 in persistence, compile in Compile) map
((_, _, _, _, _, _, _, c) => c))
else Seq.empty
)
) dependsOn(osgiDiningHakkersSampleApi, osgiDiningHakkersSampleCommand, osgiDiningHakkersSampleCore, uncommons)

lazy val osgiSampleSettings: Seq[Setting[_]] = Seq(target := baseDirectory.value / "target-sbt")

lazy val docs = Project(
id = "akka-docs",
base = file("akka-docs"),
dependencies = Seq(actor, testkit % "test->test",
remote % "compile;test->test", cluster, slf4j, agent, zeroMQ, camel, osgi, osgiAries,
remote % "compile;test->test", cluster, slf4j, agent, zeroMQ, camel, osgi,
persistence % "compile;test->test"),
settings = defaultSettings ++ docFormatSettings ++ site.settings ++ site.sphinxSupport() ++ site.publishSite ++ sphinxPreprocessing ++ cpsPlugin ++ Seq(
sourceDirectory in Sphinx <<= baseDirectory / "rst",

@@ -826,7 +773,16 @@ object AkkaBuild extends Build {

// don't save test output to a file
testListeners in (Test, test) := Seq(TestLogger(streams.value.log, {_ => streams.value.log }, logBuffered.value))
)
) ++ (System.getProperty("akka.build.M2Dir") match {
case null => Seq.empty
case path =>
// Maven resolver settings
Seq(
otherResolvers :=
Resolver.file("user-publish-m2-local", new File(path)) :: publishTo.value.toList,
publishM2Configuration := Classpaths.publishConfig(packagedArtifacts.value, None, resolverName = "user-publish-m2-local", checksums = checksums.in(publishM2).value, logging = ivyLoggingLevel.value)
)
})

val validatePullRequest = TaskKey[Unit]("validate-pull-request", "Additional tasks for pull request validation")
@@ -929,7 +885,7 @@ object AkkaBuild extends Build {
case (false, _) => Seq.empty
})

lazy val scaladocDiagramsEnabled = System.getProperty("akka.scaladoc.diagrams", "true").toBoolean
lazy val scaladocDiagramsEnabled = System.getProperty("akka.scaladoc.diagrams", "false").toBoolean
lazy val scaladocAutoAPI = System.getProperty("akka.scaladoc.autoapi", "true").toBoolean

lazy val scaladocOptions = List("-implicits") ::: (if (scaladocDiagramsEnabled) List("-diagrams") else Nil)

@@ -1010,13 +966,6 @@ object AkkaBuild extends Build {
}
}

def copyFile(source: String, sink: String){
val src = new java.io.File(source)
val dest = new java.io.File(sink)
new java.io.FileOutputStream(dest) getChannel() transferFrom(
new java.io.FileInputStream(src) getChannel, 0, Long.MaxValue )
}

// OSGi settings

object OSGi {

@@ -1028,8 +977,15 @@ object AkkaBuild extends Build {
packagedArtifact in (Compile, packageBin) <<= (artifact in (Compile, packageBin), OsgiKeys.bundle).identityMap
)

//akka-actor is wrapped into akka-osgi to simplify OSGi deployement.

val actor = osgiSettings ++ Seq(
OsgiKeys.exportPackage := Seq("akka*"),
OsgiKeys.privatePackage := Seq("akka.osgi.impl"),
//akka-actor packages are not imported, as contained in the CP
OsgiKeys.importPackage := (osgiOptionalImports map optionalResolution) ++ Seq("!sun.misc", scalaImport(), configImport(), "*"),
// dynamicImportPackage needed for loading classes defined in configuration
OsgiKeys.dynamicImportPackage := Seq("*")
)

val agent = exports(Seq("akka.agent.*"))

val camel = exports(Seq("akka.camel.*"))

@@ -1040,12 +996,7 @@ object AkkaBuild extends Build {

val mailboxesCommon = exports(Seq("akka.actor.mailbox.*"), imports = Seq(protobufImport()))

val osgi = osgiSettings ++ Seq(
OsgiKeys.exportPackage := Seq("akka*"), //exporting akka packages enforces bnd to aggregate akka-actor packages in the bundle
OsgiKeys.privatePackage := Seq("akka.osgi.impl"),
//akka-actor packages are not imported, as contained in the CP
OsgiKeys.importPackage := (osgiOptionalImports map optionalResolution) ++ Seq("!sun.misc", scalaImport(),configImport(), "*")
)
val osgi = exports(Seq("akka.osgi.*"))

val osgiDiningHakkersSampleApi = exports(Seq("akka.sample.osgi.api"))

@@ -1055,8 +1006,6 @@ object AkkaBuild extends Build {

val osgiDiningHakkersSampleUncommons = exports(Seq("org.uncommons.maths.random")) ++ Seq(OsgiKeys.privatePackage := Seq("org.uncommons.maths.binary", "org.uncommons.maths", "org.uncommons.maths.number"))

val osgiAries = exports() ++ Seq(OsgiKeys.privatePackage := Seq("akka.osgi.aries.*"))

val remote = exports(Seq("akka.remote.*"), imports = Seq(protobufImport()))

val slf4j = exports(Seq("akka.event.slf4j.*"))

@@ -1071,25 +1020,11 @@ object AkkaBuild extends Build {

val zeroMQ = exports(Seq("akka.zeromq.*"), imports = Seq(protobufImport()) )

val osgiOptionalImports = Seq("akka.remote",
"akka.remote.transport.netty",
"akka.remote.security.provider",
"akka.remote.netty",
"akka.remote.routing",
"akka.remote.transport",
"akka.remote.serialization",
"akka.cluster",
"akka.cluster.routing",
"akka.cluster.protobuf",
"akka.transactor",
"akka.agent",
"akka.dataflow",
"akka.actor.mailbox",
"akka.camel.internal",
"akka.camel.javaapi",
"akka.camel",
"akka.camel.internal.component",
"akka.zeromq",
val osgiOptionalImports = Seq(
// needed because testkit is normally not used in the application bundle,
// but it should still be included as transitive dependency and used by BundleDelegatingClassLoader
// to be able to find refererence.conf
"akka.testkit",
"com.google.protobuf")

def exports(packages: Seq[String] = Seq(), imports: Seq[String] = Nil) = osgiSettings ++ Seq(
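Editor's note: the commit message describes delegating class and resource loading to a BundleDelegatingClassLoader so that classes and reference.conf resources are resolved through the application bundle. The following is only a rough, hypothetical sketch of that idea for orientation; it is not the actual akka-osgi implementation.

    // Hypothetical sketch only, illustrating the bundle-delegation idea.
    import java.net.URL
    import org.osgi.framework.Bundle

    class NaiveBundleDelegatingClassLoader(bundle: Bundle, fallback: ClassLoader)
      extends ClassLoader(null) {

      // resolve classes through the bundle's class space first
      override protected def findClass(name: String): Class[_] =
        try bundle.loadClass(name)
        catch { case _: ClassNotFoundException => fallback.loadClass(name) }

      // resources such as reference.conf are looked up in the bundle as well
      override protected def findResource(name: String): URL = {
        val url = bundle.getResource(name)
        if (url ne null) url else fallback.getResource(name)
      }
    }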
@@ -1098,7 +1033,7 @@ object AkkaBuild extends Build {
)
def defaultImports = Seq("!sun.misc", akkaImport(), configImport(), scalaImport(), "*")
def akkaImport(packageName: String = "akka.*") = "%s;version=\"[2.3,2.4)\"".format(packageName)
def configImport(packageName: String = "com.typesafe.config.*") = "%s;version=\"[0.4.1,1.1.0)\"".format(packageName)
def configImport(packageName: String = "com.typesafe.config.*") = "%s;version=\"[1.2.0,1.3.0)\"".format(packageName)
def protobufImport(packageName: String = "com.google.protobuf.*") = "%s;version=\"[2.5.0,2.6.0)\"".format(packageName)
def scalaImport(packageName: String = "scala.*") = "%s;version=\"[2.10,2.11)\"".format(packageName)
def optionalResolution(packageName: String) = "%s;resolution:=optional".format(packageName)
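Editor's note: as a quick illustration, these helpers simply render OSGi Import-Package clauses as strings, so with the updated config version range they evaluate to:

    // Worked example of the import clauses produced by the helpers above.
    configImport()                     // "com.typesafe.config.*;version=\"[1.2.0,1.3.0)\""
    scalaImport()                      // "scala.*;version=\"[2.10,2.11)\""
    optionalResolution("akka.testkit") // "akka.testkit;resolution:=optional"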
@@ -1121,21 +1056,30 @@ object Dependencies {
object Compile {
import Versions._

// Several dependencies are mirrored in the OSGi Dining Hackers maven project
// They need to be changed in this file as well:
// akka-samples/akka-sample-osgi-dining-hakkers/pom.xml

// Compile
val camelCore = "org.apache.camel" % "camel-core" % "2.10.3" exclude("org.slf4j", "slf4j-api") // ApacheV2

val config = "com.typesafe" % "config" % "1.2.0" // ApacheV2
// mirrored in OSGi sample
val netty = "io.netty" % "netty" % "3.8.0.Final" // ApacheV2
// mirrored in OSGi sample
val protobuf = "com.google.protobuf" % "protobuf-java" % "2.5.0" // New BSD
val scalaStm = "org.scala-stm" %% "scala-stm" % scalaStmVersion // Modified BSD (Scala)

val slf4jApi = "org.slf4j" % "slf4j-api" % "1.7.5" // MIT
val zeroMQClient = "org.zeromq" %% "zeromq-scala-binding" % scalaZeroMQVersion // ApacheV2
// mirrored in OSGi sample
val uncommonsMath = "org.uncommons.maths" % "uncommons-maths" % "1.2.2a" exclude("jfree", "jcommon") exclude("jfree", "jfreechart") // ApacheV2
val ariesBlueprint = "org.apache.aries.blueprint" % "org.apache.aries.blueprint" % "1.1.0" // ApacheV2
val osgiCore = "org.osgi" % "org.osgi.core" % "4.2.0" // ApacheV2
val osgiCompendium= "org.osgi" % "org.osgi.compendium" % "4.2.0" // ApacheV2
// mirrored in OSGi sample
val osgiCore = "org.osgi" % "org.osgi.core" % "4.3.1" // ApacheV2
val osgiCompendium= "org.osgi" % "org.osgi.compendium" % "4.3.1" // ApacheV2
// mirrored in OSGi sample
val levelDB = "org.iq80.leveldb" % "leveldb" % "0.5" // ApacheV2
// mirrored in OSGi sample
val levelDBNative = "org.fusesource.leveldbjni" % "leveldbjni-all" % "1.7" // New BSD

// Camel Sample

@@ -1157,13 +1101,18 @@ object Dependencies {
val logback = "ch.qos.logback" % "logback-classic" % "1.0.13" % "test" // EPL 1.0 / LGPL 2.1
val mockito = "org.mockito" % "mockito-all" % "1.8.1" % "test" // MIT
// changing the scalatest dependency must be reflected in akka-docs/rst/dev/multi-jvm-testing.rst
// mirrored in OSGi sample
val scalatest = "org.scalatest" %% "scalatest" % scalaTestVersion % "test" // ApacheV2
val scalacheck = "org.scalacheck" %% "scalacheck" % scalaCheckVersion % "test" // New BSD
val ariesProxy = "org.apache.aries.proxy" % "org.apache.aries.proxy.impl" % "1.0.1" % "test" // ApacheV2
val pojosr = "com.googlecode.pojosr" % "de.kalpatec.pojosr.framework" % "0.1.4" % "test" // ApacheV2
val pojosr = "com.googlecode.pojosr" % "de.kalpatec.pojosr.framework" % "0.2.1" % "test" // ApacheV2
val tinybundles = "org.ops4j.pax.tinybundles" % "tinybundles" % "1.0.0" % "test" // ApacheV2
val log4j = "log4j" % "log4j" % "1.2.14" % "test" // ApacheV2
val junitIntf = "com.novocode" % "junit-interface" % "0.8" % "test" // MIT
// dining hakkers integration test using pax-exam
// mirrored in OSGi sample
val karafExam = "org.apache.karaf.tooling.exam" % "org.apache.karaf.tooling.exam.container" % "2.3.1" % "test" // ApacheV2
// mirrored in OSGi sample
val paxExam = "org.ops4j.pax.exam" % "pax-exam-junit4" % "2.6.0" % "test" // ApacheV2
}
}

@@ -1201,14 +1150,14 @@ object Dependencies {

val osgi = Seq(osgiCore, osgiCompendium, Test.logback, Test.commonsIo, Test.pojosr, Test.tinybundles, Test.scalatest, Test.junit)

val osgiDiningHakkerSampleCore = Seq(config, osgiCore, osgiCompendium)
val osgiDiningHakkersSampleCore = Seq(config, osgiCore, osgiCompendium)

val osgiDiningHakkerSampleCommand = Seq(osgiCore, osgiCompendium)
val osgiDiningHakkersSampleCommand = Seq(osgiCore, osgiCompendium)

val osgiDiningHakkersSampleTest = Seq(osgiCore, osgiCompendium, Test.karafExam, Test.paxExam, Test.junit, Test.scalatest)

val uncommons = Seq(uncommonsMath)

val osgiAries = Seq(osgiCore, osgiCompendium, ariesBlueprint, Test.ariesProxy)

val docs = Seq(Test.scalatest, Test.junit, Test.junitIntf)

val zeroMQ = Seq(protobuf, zeroMQClient, Test.scalatest, Test.junit)

@@ -11,15 +11,14 @@ object Unidoc {
lazy val GenJavaDocEnabled = Option(sys.props("akka.genjavadoc.enabled")) filter (_.toLowerCase == "true") map (_ => true) getOrElse false

lazy val javadocSettings =
inConfig(JavaDoc)(Defaults.configSettings) ++ Seq(
packageDoc in Compile <<= packageDoc in JavaDoc,
sources in JavaDoc <<= (target, compile in Compile, sources in Compile) map ((t, c, s) =>
if (GenJavaDocEnabled) (t / "java" ** "*.java").get ++ s.filter(_.getName.endsWith(".java"))
else throw new RuntimeException("cannot build java docs without -Dakka.genjavadoc.enabled=true")
),
javacOptions in JavaDoc := Seq(),
artifactName in packageDoc in JavaDoc := ((sv, mod, art) => "" + mod.name + "_" + sv.binary + "-" + mod.revision + "-javadoc.jar")
) ++ (if (GenJavaDocEnabled) Seq(
inConfig(JavaDoc)(Defaults.configSettings) ++
(if (GenJavaDocEnabled) Seq(
packageDoc in Compile <<= packageDoc in JavaDoc,
sources in JavaDoc <<= (target, compile in Compile, sources in Compile) map ((t, c, s) =>
(t / "java" ** "*.java").get ++ s.filter(_.getName.endsWith(".java"))
),
javacOptions in JavaDoc := Seq(),
artifactName in packageDoc in JavaDoc := ((sv, mod, art) => "" + mod.name + "_" + sv.binary + "-" + mod.revision + "-javadoc.jar"),
libraryDependencies += Dependencies.Compile.genjavadoc,
scalacOptions <+= target map (t => "-P:genjavadoc:out=" + (t / "java"))
) else Nil)