class S3ConnectorStreaming extends AnyRef

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new S3ConnectorStreaming()

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @IntrinsicCandidate()
  6. var count_table: String
  7. var delta_meta_table: String
  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  10. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @IntrinsicCandidate()
  11. def get_spark_context(config: Config): SparkSession
  12. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @IntrinsicCandidate()
  13. var index_table: String
  14. def initialize_db(config: Config): Unit

    Initializes the private state of this class, along with any tables required in S3.

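The table-name members above (seq_table, single_table, meta_table, index_table, count_table, last_checked_table) suggest that initialization resolves one S3 location per table. A minimal sketch of how such locations might be derived, assuming a flat `bucket/logName/table` layout — the path scheme and the `tablePath` helper are illustrative assumptions, not the actual SIESTA implementation:

```scala
// Hypothetical sketch: deriving the S3 locations that initialize_db might set up.
// The "s3a://bucket/logName/table" layout is an assumption for illustration.
object TablePaths {
  def tablePath(bucket: String, logName: String, table: String): String =
    s"s3a://$bucket/$logName/$table"

  def main(args: Array[String]): Unit = {
    val bucket  = "siesta"    // assumed bucket name
    val logName = "bpi_2017"  // assumed log-database name
    // one location per table mentioned in the member list above
    Seq("seq", "single", "meta", "index", "count", "last_checked")
      .foreach(t => println(tablePath(bucket, logName, t)))
  }
}
```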
  15. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  16. var last_checked_table: String
  17. var meta_table: String
  18. var metadata: MetaData
  19. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  20. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @IntrinsicCandidate()
  21. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @IntrinsicCandidate()
  22. var seq_table: String
  23. var single_table: String
  24. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  25. def toString(): String
    Definition Classes
    AnyRef → Any
  26. val unique_traces: Set[String]
  27. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  29. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. def write_count_table(pairs: Dataset[StreamingPair]): StreamingQuery

    Handles the writing of the metrics for each event pair in the CountTable.

    Handles the writing of the metrics for each event pair in the CountTable. This is the most complicated query in streaming mode, because the new metrics must overwrite the previously stored records.

    pairs

    The calculated pairs for the last batch

    returns

    A reference to the query, so that awaitTermination can be called at the end of all the processes
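The overwrite semantics described above can be sketched in plain Scala: per-pair metrics computed for a micro-batch are merged into the previously stored metrics, keyed by the event pair. The `CountRecord` shape and the merge rule (sum the counts, keep min/max durations) are assumptions for illustration; the real query would express this as a Delta merge inside the stream.

```scala
// Conceptual sketch of the CountTable upsert: new batch metrics merged over
// stored metrics per event pair. Field names and merge rule are assumptions.
case class CountRecord(pair: (String, String), count: Long, minDur: Long, maxDur: Long)

object CountMerge {
  def merge(stored: Map[(String, String), CountRecord],
            batch: Seq[CountRecord]): Map[(String, String), CountRecord] =
    batch.foldLeft(stored) { (acc, r) =>
      acc.get(r.pair) match {
        case Some(prev) =>
          // overwrite the stored record with the combined metrics
          acc.updated(r.pair, CountRecord(r.pair,
            prev.count + r.count,
            math.min(prev.minDur, r.minDur),
            math.max(prev.maxDur, r.maxDur)))
        case None => acc.updated(r.pair, r)
      }
    }
}
```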

  31. def write_index_table(pairs: Dataset[StreamingPair]): (StreamingQuery, StreamingQuery)

    Handles the writing of the event pairs in the IndexTable.

    Handles the writing of the event pairs in the IndexTable. The pairs have been calculated in the auth.datalab.siesta.Pipeline.SiestaStreamingPipeline using a stateful function.

    pairs

    The calculated pairs of the last batch

    returns

    References to the two queries, so that awaitTermination can be called at the end of all the processes
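A conceptual sketch of the event-pair extraction that feeds the IndexTable: the actual stateful computation lives in auth.datalab.siesta.Pipeline.SiestaStreamingPipeline, so the simple rule below (every ordered pair of events within one trace, keyed by event types) is purely illustrative, not the real pairing logic.

```scala
// Illustrative pair extraction for a single trace. The actual SIESTA
// stateful function may apply a different pairing rule.
object PairSketch {
  // events: ordered (eventType, timestamp) for one trace
  def pairs(events: Seq[(String, Long)]): Seq[((String, String), (Long, Long))] =
    for {
      i <- events.indices
      j <- (i + 1) until events.length
    } yield ((events(i)._1, events(j)._1), (events(i)._2, events(j)._2))
}
```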

  32. def write_sequence_table(df_events: Dataset[EventStream]): (StreamingQuery, StreamingQuery)

    Handles the writing of the events in the SequenceTable.

    Handles the writing of the events in the SequenceTable. There are two streaming queries: the first continuously writes the incoming events to the delta table, while the second updates the number of events and distinct traces in the metadata.

    df_events

    The events stream as it is read from the input source

    returns

    References to the two queries, so that awaitTermination can be called at the end of all the processes
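The second query's bookkeeping can be sketched as a fold over a batch of events: the event total grows by the batch size and the set of distinct traces absorbs the batch's trace identifiers (mirroring the `unique_traces: Set[String]` member above). The `Meta` shape and field names are assumptions for illustration.

```scala
// Sketch of the metadata update performed alongside the SequenceTable write.
// Field names are assumed; only the counting logic is illustrated.
case class Meta(events: Long, traces: Set[String])

object MetaUpdate {
  // batch: (traceId, eventType) pairs from the incoming stream
  def update(meta: Meta, batch: Seq[(String, String)]): Meta =
    Meta(meta.events + batch.size, meta.traces ++ batch.map(_._1))
}
```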

  33. def write_single_table(df_events: Dataset[EventStream]): StreamingQuery

    Handles the writing of the events in the SingleTable.

    Handles the writing of the events in the SingleTable. The original events are partitioned based on the event type and then written in the delta table.

    df_events

    The events stream as it is read from the input source

    returns

    A reference to the query, so that awaitTermination can be called at the end of all the processes
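The partitioning step described above amounts to grouping events by their event type before the write, so that each partition of the SingleTable holds a single type. In the sketch below a plain `groupBy` stands in for Spark's `partitionBy`; the tuple layout of an event is an assumption.

```scala
// Conceptual stand-in for partitioning the SingleTable write by event type.
object SinglePartition {
  // events: (traceId, eventType, timestamp); grouped by the eventType field
  def partition(events: Seq[(String, String, Long)]): Map[String, Seq[(String, String, Long)]] =
    events.groupBy(_._2)
}
```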

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
