Adds support for the signature directive to the paradox configuration

(copied from the akka-http configuration)
Luc Bourlier 2018-04-18 09:03:58 +00:00 committed by Arnout Engelen
parent d86730e001
commit 7b29b08d46
24 changed files with 165 additions and 112 deletions


@@ -1,23 +0,0 @@
# ++
## Signature
## Description
Just a shorthand for concat
@@@div { .callout }
**emits** when the current stream has an element available; if the current input completes, it tries the next one
**backpressures** when downstream backpressures
**completes** when all upstreams complete
@@@
## Example


@@ -2,8 +2,12 @@
Send the elements from the stream to an `ActorRef`.
@@@ div { .group-scala }
## Signature
@@signature [Sink.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Sink.scala) { #actorRef }
@@@
## Description
Send the elements from the stream to an `ActorRef`. There is no backpressure, so care must be taken not to overflow the inbox.


@@ -3,8 +3,6 @@
Send the elements from the stream to an `ActorRef` which must then acknowledge reception after completing a message,
to provide back pressure onto the sink.
## Signature
## Description
Send the elements from the stream to an `ActorRef` which must then acknowledge reception after completing a message,


@@ -2,8 +2,12 @@
Attaches the given `Sink` to this `Flow`, meaning that elements that pass through this `Flow` will also be sent to the `Sink`.
@@@ div { .group-scala }
## Signature
@@signature [Flow.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Flow.scala) { #alsoTo }
@@@
## Description
Attaches the given `Sink` to this `Flow`, meaning that elements that pass through this `Flow` will also be sent to the `Sink`.


@@ -2,8 +2,12 @@
Stream the values of an `immutable.Seq`.
@@@ div { .group-scala }
## Signature
@@signature [Source.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Source.scala) { #apply }
@@@
## Description
Stream the values of an `immutable.Seq`.


@@ -2,14 +2,24 @@
Create a sink which materializes into an `InputStream` that can be read to trigger demand through the sink.
@@@ div { .group-scala }
## Signature
@@signature [StreamConverters.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/StreamConverters.scala) { #asInputStream }
@@@
## Description
Create a sink which materializes into an `InputStream` that can be read to trigger demand through the sink.
Bytes emitted through the stream will be available for reading through the `InputStream`.
The `InputStream` will be ended when the stream flowing into this `Sink` completes, and closing the
`InputStream` will cancel the inflow of this `Sink`.
@@@div { .callout }
**cancels** when the `InputStream` is closed
**backpressures** when no read is pending on the `InputStream`
@@@
## Example


@@ -2,14 +2,26 @@
Create a sink which materializes into a Java 8 `Stream` that can be run to trigger demand through the sink.
@@@ div { .group-scala }
## Signature
@@signature [StreamConverters.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/StreamConverters.scala) { #asJavaStream }
@@@
## Description
Create a sink which materializes into a Java 8 `Stream` that can be run to trigger demand through the sink.
Elements emitted through the stream will be available for reading through the Java 8 `Stream`.
The Java 8 `Stream` will be ended when the stream flowing into this `Sink` completes, and closing the Java
`Stream` will cancel the inflow of this `Sink`. The Java `Stream` throws an exception if the reactive stream failed.
Be aware that the Java `Stream` blocks the current thread while waiting on the next element from downstream.
@@@div { .callout }
**cancels** when the Java Stream is closed
**backpressures** when no read is pending on the Java Stream
@@@
## Example


@@ -2,14 +2,24 @@
Create a source that materializes into an `OutputStream`.
@@@ div { .group-scala }
## Signature
@@signature [StreamConverters.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/StreamConverters.scala) { #asOutputStream }
@@@
## Description
Create a source that materializes into an `OutputStream`. When bytes are written to the `OutputStream` they
are emitted from the source.
The `OutputStream` will no longer be writable when the `Source` has been canceled from its downstream, and
closing the `OutputStream` will complete the `Source`.
@@@div { .callout }
**emits** when bytes are written to the `OutputStream`
**completes** when the `OutputStream` is closed
@@@
## Example


@@ -1,9 +1,13 @@
# asPublisher
Integration with Reactive Streams, materializes into a `org.reactivestreams.Publisher`.
@@@ div { .group-scala }
## Signature
@@signature [Sink.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Sink.scala) { #asPublisher }
@@@
## Description


@@ -1,9 +1,13 @@
# asSubscriber
Integration with Reactive Streams, materializes into a `org.reactivestreams.Subscriber`.
@@@ div { .group-scala }
## Signature
@@signature [Source.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Source.scala) { #asSubscriber }
@@@
## Description


@@ -2,8 +2,12 @@
Use the `ask` pattern to send a request-reply message to the target `ref` actor.
@@@ div { .group-scala }
## Signature
@@signature [Flow.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Flow.scala) { #ask }
@@@
## Description
Use the `ask` pattern to send a request-reply message to the target `ref` actor.


@@ -3,8 +3,13 @@
If the time between the emission of an element and the following downstream demand exceeds the provided timeout,
the stream is failed with a `TimeoutException`.
@@@ div { .group-scala }
## Signature
@@signature [Flow.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Flow.scala) { #backpressureTimeout }
@@@
## Description
If the time between the emission of an element and the following downstream demand exceeds the provided timeout,


@@ -2,8 +2,6 @@
Fan-out the stream to several streams.
## Signature
## Description
Fan-out the stream to several streams. Each upstream element is emitted to the first available downstream consumer.


@@ -3,8 +3,13 @@
Allow for a slower downstream by passing incoming elements and a summary into an aggregate function as long as there
is backpressure and a maximum number of batched elements is not yet reached.
@@@ div { .group-scala }
## Signature
@@signature [Flow.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Flow.scala) { #batch }
@@@
## Description
Allow for a slower downstream by passing incoming elements and a summary into an aggregate function as long as there


@@ -3,8 +3,13 @@
Allow for a slower downstream by passing incoming elements and a summary into an aggregate function as long as there
is backpressure and a maximum weight of batched elements is not yet reached.
@@@ div { .group-scala }
## Signature
@@signature [Flow.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Flow.scala) { #batchWeighted }
@@@
## Description
Allow for a slower downstream by passing incoming elements and a summary into an aggregate function as long as there


@@ -2,8 +2,6 @@
Emit each incoming element to each of `n` outputs.
## Signature
## Description
Emit each incoming element to each of `n` outputs.


@@ -0,0 +1,37 @@
# buffer
Allow for temporarily faster upstream events by buffering `size` elements.
@@@ div { .group-scala }
## Signature
@@signature [Flow.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Flow.scala) { #buffer }
@@@
## Description
Allow for temporarily faster upstream events by buffering `size` elements. When the buffer is full, a new element is
handled according to the specified `OverflowStrategy`:
* `backpressure` backpressure is applied upstream
* `dropHead` drops the oldest element in the buffer to make space for the new element
* `dropTail` drops the youngest element in the buffer to make space for the new element
* `dropBuffer` drops the entire buffer and buffers the new element
* `dropNew` drops the new element
* `fail` fails the flow with a `BufferOverflowException`
@@@div { .callout }
**emits** when downstream stops backpressuring and there is a pending element in the buffer
**backpressures** when `OverflowStrategy` is `backpressure` and the buffer is full
**completes** when upstream completes and buffered elements have been drained, or when `OverflowStrategy` is `fail`, the buffer is full and a new element arrives
@@@
## Example
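The drop-flavoured strategies listed above can be sketched as a small self-contained model. This is plain Scala illustrating the documented behaviour, not akka-stream's actual implementation; `BufferSketch` and `offer` are invented names, and the `backpressure`/`fail` strategies are omitted because they suspend or fail the stream rather than transform the buffer:

```scala
// Illustrative model of the dropping OverflowStrategy variants: given a buffer
// of capacity `size` and a new element, what does the buffer contain afterwards?
object BufferSketch {
  sealed trait Strategy
  case object DropHead   extends Strategy // drop the oldest buffered element
  case object DropTail   extends Strategy // drop the youngest buffered element
  case object DropBuffer extends Strategy // drop the entire buffer
  case object DropNew    extends Strategy // drop the incoming element

  def offer[A](buffer: Vector[A], size: Int, elem: A, strategy: Strategy): Vector[A] =
    if (buffer.length < size) buffer :+ elem // room left: just append
    else strategy match {
      case DropHead   => buffer.tail :+ elem
      case DropTail   => buffer.init :+ elem
      case DropBuffer => Vector(elem)
      case DropNew    => buffer
    }
}
```

For example, `BufferSketch.offer(Vector(1, 2, 3), 3, 9, BufferSketch.DropHead)` yields `Vector(2, 3, 9)`: the oldest element made way for the new one.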


@@ -1,24 +0,0 @@
# buffer (Backpressure)
Allow for temporarily faster upstream events by buffering `size` elements.
## Signature
## Description
Allow for temporarily faster upstream events by buffering `size` elements. When the buffer is full, backpressure
is applied.
@@@div { .callout }
**emits** when downstream stops backpressuring and there is a pending element in the buffer
**backpressures** when the buffer is full
**completes** when upstream completes and buffered elements have been drained
@@@
## Example


@@ -1,29 +0,0 @@
# buffer (Drop)
Allow for temporarily faster upstream events by buffering `size` elements.
## Signature
## Description
Allow for temporarily faster upstream events by buffering `size` elements. When the buffer is full, elements are
dropped according to the specified `OverflowStrategy`:
* `dropHead` drops the oldest element in the buffer to make space for the new element
* `dropTail` drops the youngest element in the buffer to make space for the new element
* `dropBuffer` drops the entire buffer and buffers the new element
* `dropNew` drops the new element
@@@div { .callout }
**emits** when downstream stops backpressuring and there is a pending element in the buffer
**backpressures** never (when dropping cannot keep up with incoming elements)
**completes** when upstream completes and buffered elements have been drained
@@@
## Example


@@ -1,24 +0,0 @@
# buffer (Fail)
Allow for temporarily faster upstream events by buffering `size` elements.
## Signature
## Description
Allow for temporarily faster upstream events by buffering `size` elements. When the buffer is full, the stage fails
the flow with a `BufferOverflowException`.
@@@div { .callout }
**emits** when downstream stops backpressuring and there is a pending element in the buffer
**backpressures** never, fails the stream instead of backpressuring when buffer is full
**completes** when upstream completes and buffered elements have been drained
@@@
## Example


@@ -4,6 +4,8 @@ Stream the values from an `Iterator`, requesting the next value when there is de
## Signature
@@signature [Source.scala]($akka$/akka-stream/src/main/scala/akka/stream/scaladsl/Source.scala) { #fromIterator }
## Description
Stream the values from an `Iterator`, requesting the next value when there is demand. The iterator will be created anew


@@ -2093,9 +2093,7 @@ event, and may therefore affect performance.
* [batchWeighted](reference/batchWeighted.md)
* [extrapolate](reference/extrapolate.md)
* [expand](reference/expand.md)
* [buffer (Backpressure)](reference/bufferBackpressure.md)
* [buffer (Drop)](reference/bufferDrop.md)
* [buffer (Fail)](reference/bufferFail.md)
* [buffer](reference/buffer.md)
* [prefixAndTail](reference/prefixAndTail.md)
* [groupBy](reference/groupBy.md)
* [splitWhen](reference/splitWhen.md)
@@ -2116,7 +2114,6 @@ event, and may therefore affect performance.
* [zipWith](reference/zipWith.md)
* [zipWithIndex](reference/zipWithIndex.md)
* [concat](reference/concat.md)
* [++](reference/++.md)
* [prepend](reference/prepend.md)
* [orElse](reference/orElse.md)
* [interleave](reference/interleave.md)


@@ -228,6 +228,7 @@ lazy val docs = akkaModule("akka-docs")
"google.analytics.domain.name" -> "akka.io",
"snip.code.base_dir" -> (sourceDirectory in Test).value.getAbsolutePath,
"snip.akka.base_dir" -> (baseDirectory in ThisBuild).value.getAbsolutePath,
"signature.akka.base_dir" -> (baseDirectory in ThisBuild).value.getAbsolutePath,
"fiddle.code.base_dir" -> (sourceDirectory in Test).value.getAbsolutePath
),
paradoxGroups := Map("Language" -> Seq("Scala", "Java")),
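The `signature.akka.base_dir` property added above is what gives `$akka$` its meaning in `@@signature` source paths. The resolution step can be sketched in isolation; this is a simplification of the directive's logic using only the JDK, where `SignaturePathSketch`, `resolve`, and the example paths are invented for illustration:

```scala
import java.io.File

// Simplified sketch of how a "$akka$/..." source reference is resolved against a
// "signature.<key>.base_dir" property (mirrors the directive's file lookup, minus paradox types).
object SignaturePathSketch {
  def resolve(source: String, properties: Map[String, String], pageDir: File): File =
    if (source.startsWith("$")) {
      // "$akka$/path" -> key "akka", remainder "/path"
      val baseKey = source.drop(1).takeWhile(_ != '$')
      val base = new File(properties(s"signature.$baseKey.base_dir"))
      // a relative base dir is resolved against the page's own directory
      val effectiveBase = if (base.isAbsolute) base else new File(pageDir, base.toString)
      new File(effectiveBase, source.drop(baseKey.length + 2))
    } else new File(pageDir, source)
}
```

So a page under `/docs/src` referencing `$akka$/akka-stream/scaladsl/Flow.scala` with the property set to the repository root resolves to that file inside the repository.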


@@ -4,6 +4,8 @@
package akka
import java.io.{File, FileNotFoundException}
import _root_.io.github.lukehutch.fastclasspathscanner.FastClasspathScanner
import com.lightbend.paradox.markdown._
import com.lightbend.paradox.sbt.ParadoxPlugin.autoImport._
@@ -12,16 +14,22 @@ import org.pegdown.ast._
import sbt.Keys._
import sbt._
import scala.io.{Codec, Source}
import scala.collection.JavaConverters._
object ParadoxSupport {
  val paradoxWithCustomDirectives = Seq(
    paradoxDirectives ++= Def.taskDyn {
      val log = streams.value.log
      val classpath = (fullClasspath in Compile).value.files.map(_.toURI.toURL).toArray
      val classloader = new java.net.URLClassLoader(classpath, this.getClass().getClassLoader())
      lazy val scanner = new FastClasspathScanner("akka").addClassLoader(classloader).scan()
      val allClasses = scanner.getNamesOfAllClasses.asScala.toVector
      Def.task { Seq(
        { context: Writer.Context ⇒
          new SignatureDirective(context.location.tree.label, context.properties, msg ⇒ log.warn(msg))
        },
        { _: Writer.Context ⇒ new UnidocDirective(allClasses) }
      )}
    }.value
@@ -84,4 +92,47 @@
      }
    }
  }
  class SignatureDirective(page: Page, variables: Map[String, String], logWarn: String => Unit) extends LeafBlockDirective("signature") {
    def render(node: DirectiveNode, visitor: Visitor, printer: Printer): Unit =
      try {
        val labels = node.attributes.values("identifier").asScala.map(_.toLowerCase())
        val source = node.source match {
          case direct: DirectiveNode.Source.Direct => direct.value
          case _ => sys.error("Source references are not supported")
        }
        val file =
          if (source startsWith "$") {
            val baseKey = source.drop(1).takeWhile(_ != '$')
            val base = new File(PropertyUrl(s"signature.$baseKey.base_dir", variables.get).base.trim)
            val effectiveBase = if (base.isAbsolute) base else new File(page.file.getParentFile, base.toString)
            new File(effectiveBase, source.drop(baseKey.length + 2))
          } else new File(page.file.getParentFile, source)
        val Signature = """\s*((def|val|type) (\w+)(?=[:(\[]).*)(\s+\=.*)""".r // stupid approximation to match a signature
        //println(s"Looking for signature regex '$Signature'")
        val text =
          Source.fromFile(file)(Codec.UTF8).getLines.collect {
            case line@Signature(signature, kind, l, definition) if labels contains l.toLowerCase() =>
              //println(s"Found label '$l' with sig '$full' in line $line")
              if (kind == "type") signature + definition
              else signature
          }.mkString("\n")
        if (text.trim.isEmpty) {
          logWarn(
            s"Did not find any signatures with one of those names [${labels.mkString(", ")}] in ${node.source} " +
            s"(was referenced from [${page.path}])")
          new HtmlBlockNode(s"""<div style="color: red;">[Broken signature inclusion [${labels.mkString(", ")}] to [${node.source}]</div>""").accept(visitor)
        } else {
          val lang = Option(node.attributes.value("type")).getOrElse(Snippet.language(file))
          new VerbatimNode(text, lang).accept(visitor)
        }
      } catch {
        case e: FileNotFoundException =>
          throw new SnipDirective.LinkException(s"Unknown snippet [${e.getMessage}] referenced from [${page.path}]")
      }
  }
}
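The `Signature` regex at the heart of the directive can be exercised on its own. The sketch below copies the regex and the `type`-vs-`def` handling from the directive; the sample source lines and the `findSignatures` wrapper are invented for illustration:

```scala
// Standalone exercise of the signature-matching regex used by SignatureDirective.
object SignatureRegexDemo {
  // Matches "def/val/type <name>..." followed by an "= ..." body; the body is group 4.
  val Signature = """\s*((def|val|type) (\w+)(?=[:(\[]).*)(\s+\=.*)""".r

  // Collect declarations whose name matches a requested label; keep the
  // right-hand side only for type aliases, as the directive does.
  def findSignatures(lines: Seq[String], labels: Set[String]): Seq[String] =
    lines.collect {
      case Signature(signature, kind, name, definition) if labels(name.toLowerCase) =>
        if (kind == "type") signature + definition else signature
    }
}
```

Running `findSignatures(Seq("  def buffer(size: Int): Repr[Out] = via(stage)"), Set("buffer"))` yields `Seq("def buffer(size: Int): Repr[Out]")`: the `= ...` body is stripped from `def`s but kept for `type` aliases.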