diff --git a/DATABASE.md b/DATABASE.md new file mode 100644 index 0000000..54bcf9b --- /dev/null +++ b/DATABASE.md @@ -0,0 +1,273 @@ +# API Bridges - Setting up a KQL database + +The API bridges write data to a Kafka topic, Azure Event Hubs, +or Fabric Event Streams. From Event Hubs and Event Streams, you can easily feed +this data into a Fabric Eventhouse or Azure Data Explorer (Kusto) database to +analyze and visualize it. + +This document explains how to set up a Kusto database and ingest data from the +API bridges. The principles are the same for all the bridges. + +## Creating the database - Microsoft Fabric Eventhouse + +First, you need access to Microsoft Fabric. If you don't have access yet, you +can easily set up a trial account for yourself. To set up a trial account, +follow the instructions in the +[Microsoft Fabric documentation](https://learn.microsoft.com/en-us/fabric/get-started/fabric-trial). + +If that's "TL;DR" for you, [click here and get going](https://app.fabric.microsoft.com/home). + +To set up a KQL database in Microsoft Fabric, follow these steps: + +1. [Create an Eventhouse](https://learn.microsoft.com/en-us/fabric/real-time-intelligence/create-eventhouse) or reuse an existing one. +2. [Create a KQL database](https://learn.microsoft.com/en-us/fabric/real-time-intelligence/create-database) in the Eventhouse. + +Once you have set up the KQL database, collect these two pieces of information: + +1. The `databaseName` of the KQL database. +2. The `clusterUri` of the Eventhouse. You can find this in the Eventhouse + overview in the Microsoft Fabric portal under "Eventhouse details". + + ![Cluster URI](tools/media/eventhouse-details.png) + +## Creating the database - Azure Data Explorer (Kusto) + +First, you need an Azure subscription. If you don't have an Azure subscription +yet, you can easily set up a trial account for yourself. To set up a trial +account, follow the instructions in the +[Azure documentation](https://azure.microsoft.com/en-us/free/). + +To set up a KQL database in Azure Data Explorer (Kusto), follow the steps in the [Create a Kusto cluster and database](https://docs.microsoft.com/en-us/azure/data-explorer/create-cluster-database-portal) article. + +Once you have set up the KQL database, collect these two pieces of information: + +1. The `databaseName` of the KQL database. +2. The `clusterUri` of the Kusto cluster. You can find this in the Azure portal + under "Overview" in the Azure Data Explorer (Kusto) resource. It's the value + of the "URI" field. + +## Adding the event data schemas to the KQL database + +The KQL scripts in the projects contain the necessary table schemas, +materialized views, and update policies to organize the event +data in the KQL database. Consider this your "bronze" layer. + +The scripts are located here: +- GTFS: [gtfs.kql](gtfs/kql/gtfs.kql) +- NOAA: [noaa.kql](noaa/kql/noaa.kql) +- PegelOnline: [pegelonline.kql](pegelonline/kql/pegelonline.kql) +- RSS: [feeds.kql](rss/kql/feeds.kql) + +You can run the scripts one by one in the Kusto Query Language (KQL) editor in +the Fabric or Azure database portal, or you can use the `kusto.cli` tool for +which you find two helper scripts in the [tools](tools) directory.
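+ +After a script has run, you can sanity-check the result with standard Kusto +management commands (these are generic Kusto commands, not specific to this +repository; run each command on its own): + +```kusto +// lists the tables created by the script +.show tables + +// lists the materialized views created by the script +.show materialized-views +```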
+ +### Installing the Kusto CLI tool + +To install the Kusto CLI tool, run the following command: + +```shell +.\tools\install-kusto-cli.ps1 +``` + +The script downloads the Kusto CLI tool, installs it into the `KustoCLI` directory of your user profile, and adds the tools directory inside it to the user's PATH. If you run the script from Windows PowerShell, the .NET Framework 4.7 version is installed; if you run it from PowerShell 7+, the .NET 6.0 version is installed. + +### Running a KQL script + +To run a KQL script, use the following command: + +```shell +.\tools\run-kql-script.ps1 -clusterUri '<clusterUri>' -databaseName '<databaseName>' -script .\gtfs\kql\gtfs.kql +``` + +Replace `<clusterUri>` and `<databaseName>` with the values you collected earlier. + +This assumes that you run the script from the root of the repository. If you run it from a different location, adjust the path to the script accordingly. + +### The schema and how it works + +The script creates a table for each payload schema of every event that is being +sent by the bridge. The table contains the "top level" fields of the payload +schema flattened into columns. If a payload field is an object or an array, the +column type is `dynamic`. The CloudEvents metadata is stored in the table as +well, with each attribute name prefixed with `___`. + +Example: + +```kusto +.create-merge table [Trips] ( + [routeId]: string, + [serviceDates]: dynamic, + [serviceExceptions]: dynamic, + [tripId]: string, + [tripHeadsign]: string, + [tripShortName]: string, + [directionId]: string, + [blockId]: string, + [shapeId]: string, + [wheelchairAccessible]: string, + [bikesAllowed]: string, + [___type]: string, + [___source]: string, + [___id]: string, + [___time]: datetime, + [___subject]: string +); +``` + +The script also creates a materialized view for each payload schema that gives +you the latest event for each unique combination of the CloudEvent `___type`, +`___source`, and `___subject`. This is useful for querying the latest state of +each entity. The bridge will generally populate the `___subject` with a key +value for the entity (the `tripId` in the example above), so you can easily +query the latest state and avoid reading duplicates in the form of older +versions. + +```kusto +.create materialized-view with (backfill=true) TripsLatest on table Trips { + Trips | summarize arg_max(___time, *) by ___type, ___source, ___subject +} +``` + +You will also find that the input schema's descriptive metadata is added +to the table as table and column docstrings. + +Inside the docstrings you may find two fields: + +- `description`: A human-readable description of the field. +- `schema`: An Apache Avro schema that describes the field's data type and possible values. This is added if the available type information is richer than what can be expressed in KQL.
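+ +You can read these docstrings back at any time. A minimal sketch, using the standard `.show table ... schema as json` management command and the `Trips` table from the example above: + +```kusto +// returns the table schema as JSON, including the table and column docstrings +.show table Trips schema as json +```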
+ +Example of the docstrings created by the script: + +```kusto +.alter table [Trips] column-docstrings ( + [routeId]: "{\"description\": \"Identifies a route.\"}", + [tripId]: "{\"description\": \"Identifies a trip.\"}", + [tripHeadsign]: "{\"description\": \"Text that appears on signage identifying the trip's destination to riders.\", \"schema\": [\"null\", \"string\"]}", + [tripShortName]: "{\"description\": \"Public facing text used to identify the trip to riders.\", \"schema\": [\"null\", \"string\"]}", + [directionId]: "{\"description\": \"Indicates the direction of travel for a trip.\", \"schema\": {\"type\": \"enum\", \"name\": \"DirectionId\", \"namespace\": \"GeneralTransitFeedStatic\", \"symbols\": [\"OUTBOUND\", \"INBOUND\"], \"doc\": \"Indicates the direction of travel for a trip. Symbols: OUTBOUND - Travel in one direction; INBOUND - Travel in the opposite direction.\"}}", + [blockId]: "{\"description\": \"Identifies the block to which the trip belongs.\", \"schema\": [\"null\", \"string\"]}", + [shapeId]: "{\"description\": \"Identifies a geospatial shape describing the vehicle travel path for a trip.\", \"schema\": [\"null\", \"string\"]}", + [wheelchairAccessible]: "{\"description\": \"Indicates wheelchair accessibility.\", \"schema\": {\"type\": \"enum\", \"name\": \"WheelchairAccessible\", \"namespace\": \"GeneralTransitFeedStatic\", \"symbols\": [\"NO_INFO\", \"WHEELCHAIR_ACCESSIBLE\", \"NOT_WHEELCHAIR_ACCESSIBLE\"], \"doc\": \"Indicates wheelchair accessibility. Symbols: NO_INFO - No accessibility information for the trip; WHEELCHAIR_ACCESSIBLE - Vehicle can accommodate at least one rider in a wheelchair; NOT_WHEELCHAIR_ACCESSIBLE - No riders in wheelchairs can be accommodated on this trip.\"}}", + [bikesAllowed]: "{\"description\": \"Indicates whether bikes are allowed.\", \"schema\": {\"type\": \"enum\", \"name\": \"BikesAllowed\", \"namespace\": \"GeneralTransitFeedStatic\", \"symbols\": [\"NO_INFO\", \"BICYCLE_ALLOWED\", \"BICYCLE_NOT_ALLOWED\"], \"doc\": \"Indicates whether bikes are allowed. Symbols: NO_INFO - No bike information for the trip; BICYCLE_ALLOWED - Vehicle can accommodate at least one bicycle; BICYCLE_NOT_ALLOWED - No bicycles are allowed on this trip.\"}}", + [___type] : 'Event type', + [___source]: 'Context origin/source of the event', + [___id]: 'Event identifier', + [___time]: 'Event generation time', + [___subject]: 'Context subject of the event' +); +``` + +The script also creates an update policy for each table that imports "its" events from +the `_cloudevents_dispatch` table. This is explained further in the next section.
````kusto +.alter table [Trips] policy update +``` +[{ + "IsEnabled": true, + "Source": "_cloudevents_dispatch", + "Query": "_cloudevents_dispatch | where (specversion == '1.0' and type == 'GeneralTransitFeedStatic.Trips') | project['routeId'] = tostring(data.['routeId']),['serviceDates'] = todynamic(data.['serviceDates']),['serviceExceptions'] = todynamic(data.['serviceExceptions']),['tripId'] = tostring(data.['tripId']),['tripHeadsign'] = tostring(data.['tripHeadsign']),['tripShortName'] = tostring(data.['tripShortName']),['directionId'] = tostring(data.['directionId']),['blockId'] = tostring(data.['blockId']),['shapeId'] = tostring(data.['shapeId']),['wheelchairAccessible'] = tostring(data.['wheelchairAccessible']),['bikesAllowed'] = tostring(data.['bikesAllowed']),___type = type,___source = source,___id = ['id'],___time = ['time'],___subject = subject", + "IsTransactional": false, + "PropagateIngestionProperties": true +}] +``` +```` + +Finally, there are two ingestion mappings created for each table. The `_json_flat` mapping assumes that the +arriving events are already in the flattened form of the table schema. The `_json_ce_structured` mapping assumes that the arriving events are in CloudEvents structured JSON format and extracts the payload columns from the `data` field. + + +## Ingesting data from the API bridge + +Now that you have set up the KQL database and added the necessary schemas, you +still need to wire up the API bridge to the KQL database. + +The table schema elements added by the script allow for several avenues for +ingesting data. We will discuss two of them here: + +### Ingesting event data via the `_cloudevents_dispatch` table + +In this scenario, you will set up your KQL database to ingest data from the +Event Stream or Event Hub into the `_cloudevents_dispatch` table, using the +`_cloudevents_dispatch_json` JSON ingestion mapping. + +#### Setting up the ingestion in Microsoft Fabric + +Before you start, the bridge should be running or should at least already have +run once to send some events to the target Event Stream or Event Hub. Data must +already be present in the Event Stream or Event Hub to complete the setup. + +1. In the Microsoft Fabric portal, navigate to the Eventhouse and database you + set up. + +2. Find the "Get Data" button in the ribbon, and click it or expand its menu. + Select either "Event Hub" or "Event Stream" depending on which you are + publishing to:
+ +3. In the "Pick a destination table and configure the source" dialog, select the + `_cloudevents_dispatch` table, then complete the source configuration.
+ + Detailed instructions for connecting to Event Hubs and Event Streams sources can + be found in the documentation: + + - [Get data from Azure Event Hubs](https://learn.microsoft.com/en-us/fabric/real-time-intelligence/get-data-event-hub) + + - [Get data from Event Streams](https://learn.microsoft.com/en-us/fabric/real-time-intelligence/get-data-event-streams) + +4. It is recommended to open the "Advanced filters" section and set the + "Event retrieval start date" to a date well before you started the bridge for + the first time. + +5. Click "Next". On the upper right-hand side of the "Inspect the data" dialog, + make sure that "Format" is "JSON", then click "Advanced", select "existing + mapping" in the box, and choose the `_cloudevents_dispatch_json` mapping. +
+ +6. Click "Finish" to start the ingestion. + +Once you've completed these steps, events will start showing up in the `_cloudevents_dispatch` table +and the update policies will take care of importing the events into the per-event-type tables. + +#### Setting up the ingestion in Azure Data Explorer (Kusto) + +With Azure Data Explorer, the steps are similar to the ones for Microsoft Fabric, but the +portal entry point is different. + +Follow the instructions in the +[Create an Event Hubs data connection for Azure Data Explorer](https://learn.microsoft.com/en-us/azure/data-explorer/create-event-hubs-connection?tabs=get-data%2Cget-data-2) +article, and select the table and mappings as described above. + +### Ingesting directly into the event tables from Azure Stream Analytics + +If you are using Azure Stream Analytics to pre-process the data from the Event Hub +or Event Stream, you can push the event data directly into the event tables. + +1. In the Azure portal, navigate to the Azure Stream Analytics job that you have + set up to process the data from the Event Hub or Event Stream. +2. In the job's "Overview" pane, click on the "Outputs" tab and add a new output +3. Select "Azure Data Explorer" as the output type. +4. Configure the output with the connection string of the Azure Data Explorer + cluster and the database name. +5. In the "Table" field, enter the name of the event table you want to target. + +In the "Query" field, you can use the `SELECT` statement to map the incoming +data to the table schema, filtered by `type`. For example: + +```sql +SELECT + data.*, + type as ___type, + source as ___source, + id as ___id, + time as ___time, + subject as ___subject +INTO + Trips +FROM + input_stream +WHERE + specversion = '1.0' AND type = 'GeneralTransitFeedStatic.Trips' +``` + diff --git a/gtfs/CONTAINER.md b/gtfs/CONTAINER.md index 500ea5c..7c7d4e1 100644 --- a/gtfs/CONTAINER.md +++ b/gtfs/CONTAINER.md @@ -32,6 +32,12 @@ Use as base image in Dockerfile: FROM ghcr.io/clemensv/real-time-sources-gtfs:latest ``` +## Database Schemas and handling + +If you want to build a full data pipeline with all events ingested into +database, the integration with Fabric Eventhouse and Azure Data Explorer is +described in [DATABASE.md](../DATABASE.md). + ## Using the container image The container image defines a single command that starts the bridge. The bridge diff --git a/gtfs/gtfs_rt_bridge/src/gtfs_rt_bridge/gtfs_cli.py b/gtfs/gtfs_rt_bridge/src/gtfs_rt_bridge/gtfs_cli.py index acb64a1..f8154a8 100644 --- a/gtfs/gtfs_rt_bridge/src/gtfs_rt_bridge/gtfs_cli.py +++ b/gtfs/gtfs_rt_bridge/src/gtfs_rt_bridge/gtfs_cli.py @@ -107,7 +107,6 @@ logging.basicConfig(level=logging.INFO) logger = logging.getLogger(__name__) - def fetch_schedule_file(gtfs_url: str, mdb_source_id: str, gtfs_headers: List[List[str]], etag: str, cache_dir: str | None) -> Tuple[str, str]: """ Fetches the latest schedule file from the schedule URL if the file does not exist in the cache. 
@@ -1049,10 +1048,9 @@ async def feed_realtime_messages(agency_id: str, kafka_bootstrap_servers: str, k "linger.ms": 100, "retries": 5, "retry.backoff.ms": 1000, - "batch.size": (1024*1024)-512, - "compression.type": "gzip" + "batch.size": (1024*1024)-512 } - producer: Producer = Producer(kafka_config) + producer: Producer = Producer(kafka_config, logger=logger) gtfs_rt_producer = GeneralTransitFeedRealTimeEventProducer(producer, kafka_topic,cloudevents_mode) gtfs_static_producer = GeneralTransitFeedStaticEventProducer(producer, kafka_topic, cloudevents_mode) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/__init__.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/__init__.py index dee27db..c3bf7b1 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/__init__.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/__init__.py @@ -1,4 +1,4 @@ -from .generaltransitfeedstatic import agency, locationtype, wheelchairboarding, stops, faremedia, transfers, pickuptype, dropofftype, continuouspickup, continuousdropoff, timepoint, stoptimes, bookingrules, fareproducts, faretransferrules, pathways, shapes, locationgroupstores, translations, locationgeojson, serviceavailability, calendar, exceptiontype, calendardates, timeframes, fareattributes, levels, directionid, wheelchairaccessible, bikesallowed, trips, frequencies, routetype, routes, stopareas, attributions, feedinfo, locationgroups, routenetworks, areas, networks, farerules, farelegrules -from .generaltransitfeedrealtime import alert, trip, vehicle +from .generaltransitfeedstatic import bookingrules, fareproducts, locationtype, wheelchairboarding, stops, pathways, serviceavailability, calendar, exceptiontype, calendardates, timeframes, stopareas, fareattributes, locationgroups, pickuptype, dropofftype, continuouspickup, continuousdropoff, timepoint, stoptimes, faremedia, faretransferrules, locationgeojson, transfers, levels, frequencies, directionid, wheelchairaccessible, bikesallowed, trips, farelegrules, attributions, routenetworks, shapes, agency, locationgroupstores, areas, routetype, routes, networks, farerules, translations, feedinfo +from .generaltransitfeedrealtime import trip, alert, vehicle -__all__ = ["agency", "locationtype", "wheelchairboarding", "stops", "faremedia", "transfers", "pickuptype", "dropofftype", "continuouspickup", "continuousdropoff", "timepoint", "stoptimes", "bookingrules", "fareproducts", "faretransferrules", "pathways", "shapes", "locationgroupstores", "translations", "locationgeojson", "serviceavailability", "calendar", "exceptiontype", "calendardates", "timeframes", "fareattributes", "levels", "directionid", "wheelchairaccessible", "bikesallowed", "trips", "frequencies", "routetype", "routes", "stopareas", "attributions", "feedinfo", "locationgroups", "routenetworks", "areas", "networks", "farerules", "farelegrules", "alert", "trip", "vehicle"] +__all__ = ["bookingrules", "fareproducts", "locationtype", "wheelchairboarding", "stops", "pathways", "serviceavailability", "calendar", "exceptiontype", "calendardates", "timeframes", "stopareas", "fareattributes", "locationgroups", "pickuptype", "dropofftype", "continuouspickup", "continuousdropoff", "timepoint", "stoptimes", "faremedia", "faretransferrules", "locationgeojson", "transfers", "levels", "frequencies", "directionid", "wheelchairaccessible", "bikesallowed", "trips", "farelegrules", "attributions", "routenetworks", "shapes", "agency", 
"locationgroupstores", "areas", "routetype", "routes", "networks", "farerules", "translations", "feedinfo", "trip", "alert", "vehicle"] diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/__init__.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/__init__.py index 6e3d67b..2f3a8e9 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/__init__.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/__init__.py @@ -1,5 +1,5 @@ -from .alert import timerange, tripdescriptor_types, tripdescriptor, entityselector, alert_types, translatedstring_types, translatedstring, alert from .trip import tripdescriptor_types, tripdescriptor, vehicledescriptor, tripupdate_types, tripupdate +from .alert import timerange, tripdescriptor_types, tripdescriptor, entityselector, alert_types, translatedstring_types, translatedstring, alert from .vehicle import tripdescriptor_types, tripdescriptor, vehicledescriptor, position, vehicleposition_types, vehicleposition -__all__ = ["timerange", "tripdescriptor_types", "tripdescriptor", "entityselector", "alert_types", "translatedstring_types", "translatedstring", "alert", "tripdescriptor_types", "tripdescriptor", "vehicledescriptor", "tripupdate_types", "tripupdate", "tripdescriptor_types", "tripdescriptor", "vehicledescriptor", "position", "vehicleposition_types", "vehicleposition"] +__all__ = ["tripdescriptor_types", "tripdescriptor", "vehicledescriptor", "tripupdate_types", "tripupdate", "timerange", "tripdescriptor_types", "tripdescriptor", "entityselector", "alert_types", "translatedstring_types", "translatedstring", "alert", "tripdescriptor_types", "tripdescriptor", "vehicledescriptor", "position", "vehicleposition_types", "vehicleposition"] diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/alert.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/alert.py index 68cbb2d..82b2082 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/alert.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/alert.py @@ -8,10 +8,10 @@ import dataclasses import dataclasses_json import json -from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.entityselector import EntitySelector +from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.alert_types.cause import Cause from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.alert_types.effect import Effect from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.timerange import TimeRange -from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.alert_types.cause import Cause +from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.entityselector import EntitySelector from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.translatedstring import TranslatedString @@ -35,7 +35,7 @@ class Alert: effect: typing.Optional[Effect]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="effect")) url: typing.Optional[TranslatedString]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="url")) header_text: typing.Optional[TranslatedString]=dataclasses.field(kw_only=True, 
metadata=dataclasses_json.config(field_name="header_text")) - description_text: typing.Optional[TranslatedString]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="description_text")) + description_text: typing.Optional[TranslatedString]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="description_text")) def __post_init__(self): @@ -100,7 +100,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/entityselector.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/entityselector.py index a3d273a..aa94d02 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/entityselector.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/entityselector.py @@ -27,7 +27,7 @@ class EntitySelector: route_id: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="route_id")) route_type: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="route_type")) trip: typing.Optional[TripDescriptor]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="trip")) - stop_id: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stop_id")) + stop_id: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stop_id")) def __post_init__(self): @@ -90,7 +90,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/timerange.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/timerange.py index 165d7ab..975b02e 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/timerange.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/timerange.py @@ -20,7 +20,7 @@ class TimeRange: end (typing.Optional[int]): End time, in POSIX time (i.e., number of seconds since January 1st 1970 00:00:00 UTC). 
If missing, the interval ends at plus infinity.""" start: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="start")) - end: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="end")) + end: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="end")) def __post_init__(self): @@ -80,7 +80,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/translatedstring.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/translatedstring.py index dd90c3e..a5dfad0 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/translatedstring.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/translatedstring.py @@ -19,7 +19,7 @@ class TranslatedString: Attributes: translation (typing.List[Translation]): At least one translation must be provided.""" - translation: typing.List[Translation]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="translation")) + translation: typing.List[Translation]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="translation")) def __post_init__(self): @@ -78,7 +78,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/translatedstring_types/translation.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/translatedstring_types/translation.py index 88256ed..27506f2 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/translatedstring_types/translation.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/translatedstring_types/translation.py @@ -20,7 +20,7 @@ class Translation: language (typing.Optional[str]): BCP-47 language code. Can be omitted if the language is unknown or if no i18n is done at all for the feed. 
At most one translation is allowed to have an unspecified language tag.""" text: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="text")) - language: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="language")) + language: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="language")) def __post_init__(self): @@ -80,7 +80,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/tripdescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/tripdescriptor.py index 50e00ed..90903d2 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/tripdescriptor.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/alert/tripdescriptor.py @@ -29,7 +29,7 @@ class TripDescriptor: direction_id: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="direction_id")) start_time: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="start_time")) start_date: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="start_date")) - schedule_relationship: typing.Optional[ScheduleRelationship]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="schedule_relationship")) + schedule_relationship: typing.Optional[ScheduleRelationship]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="schedule_relationship")) def __post_init__(self): @@ -93,7 +93,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripdescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripdescriptor.py index 20b9f24..26153dc 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripdescriptor.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripdescriptor.py @@ -29,7 +29,7 @@ class TripDescriptor: direction_id: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="direction_id")) start_time: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="start_time")) start_date: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="start_date")) - schedule_relationship: typing.Optional[ScheduleRelationship]=dataclasses.field(kw_only=True, 
metadata=dataclasses_json.config(field_name="schedule_relationship")) + schedule_relationship: typing.Optional[ScheduleRelationship]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="schedule_relationship")) def __post_init__(self): @@ -93,7 +93,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate.py index 238e2c5..ea36c0a 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate.py @@ -8,9 +8,9 @@ import dataclasses import dataclasses_json import json -from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.vehicledescriptor import VehicleDescriptor -from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripupdate_types.stoptimeupdate import StopTimeUpdate from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripdescriptor import TripDescriptor +from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripupdate_types.stoptimeupdate import StopTimeUpdate +from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.vehicledescriptor import VehicleDescriptor @dataclasses_json.dataclass_json @@ -29,7 +29,7 @@ class TripUpdate: vehicle: typing.Optional[VehicleDescriptor]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="vehicle")) stop_time_update: typing.List[StopTimeUpdate]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stop_time_update")) timestamp: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="timestamp")) - delay: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="delay")) + delay: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="delay")) def __post_init__(self): @@ -93,7 +93,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate_types/stoptimeevent.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate_types/stoptimeevent.py index eb96bd6..d2ad956 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate_types/stoptimeevent.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate_types/stoptimeevent.py @@ -22,7 +22,7 @@ class StopTimeEvent: delay: typing.Optional[int]=dataclasses.field(kw_only=True, 
metadata=dataclasses_json.config(field_name="delay")) time: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="time")) - uncertainty: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="uncertainty")) + uncertainty: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="uncertainty")) def __post_init__(self): @@ -83,7 +83,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate_types/stoptimeupdate.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate_types/stoptimeupdate.py index 7f8f2a7..74395b7 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate_types/stoptimeupdate.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/tripupdate_types/stoptimeupdate.py @@ -8,8 +8,8 @@ import dataclasses import dataclasses_json import json -from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripupdate_types.stoptimeevent import StopTimeEvent from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripupdate_types.stoptimeupdate_types.schedulerelationship import ScheduleRelationship +from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripupdate_types.stoptimeevent import StopTimeEvent @dataclasses_json.dataclass_json @@ -28,7 +28,7 @@ class StopTimeUpdate: stop_id: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stop_id")) arrival: typing.Optional[StopTimeEvent]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="arrival")) departure: typing.Optional[StopTimeEvent]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="departure")) - schedule_relationship: typing.Optional[ScheduleRelationship]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="schedule_relationship")) + schedule_relationship: typing.Optional[ScheduleRelationship]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="schedule_relationship")) def __post_init__(self): @@ -91,7 +91,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/vehicledescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/vehicledescriptor.py index 6b0c308..b2b6c1b 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/vehicledescriptor.py +++ 
b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/trip/vehicledescriptor.py @@ -22,7 +22,7 @@ class VehicleDescriptor: id: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="id")) label: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="label")) - license_plate: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="license_plate")) + license_plate: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="license_plate")) def __post_init__(self): @@ -83,7 +83,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/position.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/position.py index 707d7ea..0839135 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/position.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/position.py @@ -26,7 +26,7 @@ class Position: longitude: float=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="longitude")) bearing: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="bearing")) odometer: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="odometer")) - speed: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="speed")) + speed: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="speed")) def __post_init__(self): @@ -89,7 +89,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/tripdescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/tripdescriptor.py index 739868c..2decf10 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/tripdescriptor.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/tripdescriptor.py @@ -29,7 +29,7 @@ class TripDescriptor: direction_id: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="direction_id")) start_time: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="start_time")) start_date: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="start_date")) - schedule_relationship: 
typing.Optional[ScheduleRelationship]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="schedule_relationship")) + schedule_relationship: typing.Optional[ScheduleRelationship]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="schedule_relationship")) def __post_init__(self): @@ -93,7 +93,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/vehicledescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/vehicledescriptor.py index 6b0c308..b2b6c1b 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/vehicledescriptor.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/vehicledescriptor.py @@ -22,7 +22,7 @@ class VehicleDescriptor: id: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="id")) label: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="label")) - license_plate: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="license_plate")) + license_plate: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="license_plate")) def __post_init__(self): @@ -83,7 +83,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/vehicleposition.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/vehicleposition.py index 0700690..c3fb53c 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/vehicleposition.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedrealtime/vehicle/vehicleposition.py @@ -9,10 +9,10 @@ import dataclasses_json import json from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.vehicleposition_types.congestionlevel import CongestionLevel +from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.vehicledescriptor import VehicleDescriptor from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.tripdescriptor import TripDescriptor from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.vehicleposition_types.occupancystatus import OccupancyStatus from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.vehicleposition_types.vehiclestopstatus import VehicleStopStatus -from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.vehicledescriptor import VehicleDescriptor from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.position import Position @@ -40,7 +40,7 @@ 
class VehiclePosition: current_status: typing.Optional[VehicleStopStatus]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="current_status")) timestamp: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="timestamp")) congestion_level: typing.Optional[CongestionLevel]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="congestion_level")) - occupancy_status: typing.Optional[OccupancyStatus]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="occupancy_status")) + occupancy_status: typing.Optional[OccupancyStatus]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="occupancy_status")) def __post_init__(self): @@ -107,7 +107,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/__init__.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/__init__.py index 1ec496e..deafcc1 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/__init__.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/__init__.py @@ -1,45 +1,45 @@ -from .agency import Agency +from .bookingrules import BookingRules +from .fareproducts import FareProducts from .locationtype import LocationType from .wheelchairboarding import WheelchairBoarding from .stops import Stops -from .faremedia import FareMedia -from .transfers import Transfers +from .pathways import Pathways +from .serviceavailability import ServiceAvailability +from .calendar import Calendar +from .exceptiontype import ExceptionType +from .calendardates import CalendarDates +from .timeframes import Timeframes +from .stopareas import StopAreas +from .fareattributes import FareAttributes +from .locationgroups import LocationGroups from .pickuptype import PickupType from .dropofftype import DropOffType from .continuouspickup import ContinuousPickup from .continuousdropoff import ContinuousDropOff from .timepoint import Timepoint from .stoptimes import StopTimes -from .bookingrules import BookingRules -from .fareproducts import FareProducts +from .faremedia import FareMedia from .faretransferrules import FareTransferRules -from .pathways import Pathways -from .shapes import Shapes -from .locationgroupstores import LocationGroupStores -from .translations import Translations from .locationgeojson import LocationGeoJson -from .serviceavailability import ServiceAvailability -from .calendar import Calendar -from .exceptiontype import ExceptionType -from .calendardates import CalendarDates -from .timeframes import Timeframes -from .fareattributes import FareAttributes +from .transfers import Transfers from .levels import Levels +from .frequencies import Frequencies from .directionid import DirectionId from .wheelchairaccessible import WheelchairAccessible from .bikesallowed import BikesAllowed from .trips import Trips -from .frequencies import Frequencies -from .routetype import RouteType -from .routes import Routes -from .stopareas import StopAreas +from .farelegrules import FareLegRules 
from .attributions import Attributions -from .feedinfo import FeedInfo -from .locationgroups import LocationGroups from .routenetworks import RouteNetworks +from .shapes import Shapes +from .agency import Agency +from .locationgroupstores import LocationGroupStores from .areas import Areas +from .routetype import RouteType +from .routes import Routes from .networks import Networks from .farerules import FareRules -from .farelegrules import FareLegRules +from .translations import Translations +from .feedinfo import FeedInfo -__all__ = ["Agency", "LocationType", "WheelchairBoarding", "Stops", "FareMedia", "Transfers", "PickupType", "DropOffType", "ContinuousPickup", "ContinuousDropOff", "Timepoint", "StopTimes", "BookingRules", "FareProducts", "FareTransferRules", "Pathways", "Shapes", "LocationGroupStores", "Translations", "LocationGeoJson", "ServiceAvailability", "Calendar", "ExceptionType", "CalendarDates", "Timeframes", "FareAttributes", "Levels", "DirectionId", "WheelchairAccessible", "BikesAllowed", "Trips", "Frequencies", "RouteType", "Routes", "StopAreas", "Attributions", "FeedInfo", "LocationGroups", "RouteNetworks", "Areas", "Networks", "FareRules", "FareLegRules"] +__all__ = ["BookingRules", "FareProducts", "LocationType", "WheelchairBoarding", "Stops", "Pathways", "ServiceAvailability", "Calendar", "ExceptionType", "CalendarDates", "Timeframes", "StopAreas", "FareAttributes", "LocationGroups", "PickupType", "DropOffType", "ContinuousPickup", "ContinuousDropOff", "Timepoint", "StopTimes", "FareMedia", "FareTransferRules", "LocationGeoJson", "Transfers", "Levels", "Frequencies", "DirectionId", "WheelchairAccessible", "BikesAllowed", "Trips", "FareLegRules", "Attributions", "RouteNetworks", "Shapes", "Agency", "LocationGroupStores", "Areas", "RouteType", "Routes", "Networks", "FareRules", "Translations", "FeedInfo"] diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/agency.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/agency.py index 4cf5793..2b88a55 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/agency.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/agency.py @@ -32,7 +32,7 @@ class Agency: agencyLang: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="agencyLang")) agencyPhone: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="agencyPhone")) agencyFareUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="agencyFareUrl")) - agencyEmail: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="agencyEmail")) + agencyEmail: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="agencyEmail")) def __post_init__(self): @@ -98,7 +98,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/areas.py 
b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/areas.py index 0b81bad..8b4ea6c 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/areas.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/areas.py @@ -24,7 +24,7 @@ class Areas: areaId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="areaId")) areaName: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="areaName")) areaDesc: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="areaDesc")) - areaUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="areaUrl")) + areaUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="areaUrl")) def __post_init__(self): @@ -86,7 +86,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/attributions.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/attributions.py index 4cddbc1..47ad73f 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/attributions.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/attributions.py @@ -38,7 +38,7 @@ class Attributions: isAuthority: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="isAuthority")) attributionUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="attributionUrl")) attributionEmail: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="attributionEmail")) - attributionPhone: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="attributionPhone")) + attributionPhone: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="attributionPhone")) def __post_init__(self): @@ -107,7 +107,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/bookingrules.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/bookingrules.py index b96ed3c..8153388 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/bookingrules.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/bookingrules.py @@ -24,7 +24,7 @@ class BookingRules: bookingRuleId: str=dataclasses.field(kw_only=True, 
metadata=dataclasses_json.config(field_name="bookingRuleId")) bookingRuleName: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="bookingRuleName")) bookingRuleDesc: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="bookingRuleDesc")) - bookingRuleUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="bookingRuleUrl")) + bookingRuleUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="bookingRuleUrl")) def __post_init__(self): @@ -86,7 +86,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/calendar.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/calendar.py index 725bfca..c8ee07b 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/calendar.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/calendar.py @@ -37,7 +37,7 @@ class Calendar: saturday: ServiceAvailability=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="saturday")) sunday: ServiceAvailability=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="sunday")) startDate: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="startDate")) - endDate: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="endDate")) + endDate: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="endDate")) def __post_init__(self): @@ -105,7 +105,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/calendardates.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/calendardates.py index dbfe091..200fe66 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/calendardates.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/calendardates.py @@ -23,7 +23,7 @@ class CalendarDates: serviceId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="serviceId")) date: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="date")) - exceptionType: ExceptionType=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="exceptionType")) + exceptionType: ExceptionType=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="exceptionType")) def __post_init__(self): @@ -84,7 +84,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = 
content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/fareattributes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/fareattributes.py index e6d4bb2..80b5596 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/fareattributes.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/fareattributes.py @@ -30,7 +30,7 @@ class FareAttributes: paymentMethod: int=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="paymentMethod")) transfers: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="transfers")) agencyId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="agencyId")) - transferDuration: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="transferDuration")) + transferDuration: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="transferDuration")) def __post_init__(self): @@ -95,7 +95,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/farelegrules.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/farelegrules.py index 3a173e3..ebba70e 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/farelegrules.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/farelegrules.py @@ -28,7 +28,7 @@ class FareLegRules: legGroupId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="legGroupId")) networkId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkId")) fromAreaId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fromAreaId")) - toAreaId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="toAreaId")) + toAreaId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="toAreaId")) def __post_init__(self): @@ -92,7 +92,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/faremedia.py 
b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/faremedia.py index acf70f1..eafaf99 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/faremedia.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/faremedia.py @@ -24,7 +24,7 @@ class FareMedia: fareMediaId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareMediaId")) fareMediaName: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareMediaName")) fareMediaDesc: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareMediaDesc")) - fareMediaUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareMediaUrl")) + fareMediaUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareMediaUrl")) def __post_init__(self): @@ -86,7 +86,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/fareproducts.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/fareproducts.py index fde1df0..b773fde 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/fareproducts.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/fareproducts.py @@ -24,7 +24,7 @@ class FareProducts: fareProductId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareProductId")) fareProductName: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareProductName")) fareProductDesc: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareProductDesc")) - fareProductUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareProductUrl")) + fareProductUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fareProductUrl")) def __post_init__(self): @@ -86,7 +86,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/farerules.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/farerules.py index f03ff71..4a3d188 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/farerules.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/farerules.py @@ -26,7 +26,7 @@ class FareRules: routeId: typing.Optional[str]=dataclasses.field(kw_only=True, 
metadata=dataclasses_json.config(field_name="routeId")) originId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="originId")) destinationId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="destinationId")) - containsId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="containsId")) + containsId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="containsId")) def __post_init__(self): @@ -89,7 +89,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/faretransferrules.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/faretransferrules.py index 6c20cf0..710158b 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/faretransferrules.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/faretransferrules.py @@ -30,7 +30,7 @@ class FareTransferRules: fromLegGroupId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fromLegGroupId")) toLegGroupId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="toLegGroupId")) duration: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="duration")) - durationType: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="durationType")) + durationType: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="durationType")) def __post_init__(self): @@ -95,7 +95,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/feedinfo.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/feedinfo.py index 9a1a8bf..55280ae 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/feedinfo.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/feedinfo.py @@ -34,7 +34,7 @@ class FeedInfo: feedEndDate: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="feedEndDate")) feedVersion: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="feedVersion")) feedContactEmail: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="feedContactEmail")) - feedContactUrl: typing.Optional[str]=dataclasses.field(kw_only=True, 
metadata=dataclasses_json.config(field_name="feedContactUrl")) + feedContactUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="feedContactUrl")) def __post_init__(self): @@ -101,7 +101,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/frequencies.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/frequencies.py index 414708b..f4cb8d1 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/frequencies.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/frequencies.py @@ -26,7 +26,7 @@ class Frequencies: startTime: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="startTime")) endTime: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="endTime")) headwaySecs: int=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="headwaySecs")) - exactTimes: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="exactTimes")) + exactTimes: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="exactTimes")) def __post_init__(self): @@ -89,7 +89,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/levels.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/levels.py index e36b7c3..1f1db60 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/levels.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/levels.py @@ -22,7 +22,7 @@ class Levels: levelId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="levelId")) levelIndex: float=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="levelIndex")) - levelName: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="levelName")) + levelName: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="levelName")) def __post_init__(self): @@ -83,7 +83,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgeojson.py 
b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgeojson.py index d038868..53a2eb2 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgeojson.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgeojson.py @@ -22,7 +22,7 @@ class LocationGeoJson: locationGeoJsonId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGeoJsonId")) locationGeoJsonType: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGeoJsonType")) - locationGeoJsonData: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGeoJsonData")) + locationGeoJsonData: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGeoJsonData")) def __post_init__(self): @@ -83,7 +83,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgroups.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgroups.py index a814ee0..4ca5baa 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgroups.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgroups.py @@ -24,7 +24,7 @@ class LocationGroups: locationGroupId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGroupId")) locationGroupName: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGroupName")) locationGroupDesc: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGroupDesc")) - locationGroupUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGroupUrl")) + locationGroupUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGroupUrl")) def __post_init__(self): @@ -86,7 +86,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgroupstores.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgroupstores.py index e72b717..961a2c4 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgroupstores.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/locationgroupstores.py @@ -22,7 +22,7 @@ class LocationGroupStores: locationGroupStoreId: str=dataclasses.field(kw_only=True, 
metadata=dataclasses_json.config(field_name="locationGroupStoreId")) locationGroupId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="locationGroupId")) - storeId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="storeId")) + storeId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="storeId")) def __post_init__(self): @@ -83,7 +83,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/networks.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/networks.py index 88961a4..75ce357 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/networks.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/networks.py @@ -24,7 +24,7 @@ class Networks: networkId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkId")) networkName: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkName")) networkDesc: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkDesc")) - networkUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkUrl")) + networkUrl: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkUrl")) def __post_init__(self): @@ -86,7 +86,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/pathways.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/pathways.py index 1d5af5f..0e31b1b 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/pathways.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/pathways.py @@ -40,7 +40,7 @@ class Pathways: maxSlope: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="maxSlope")) minWidth: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="minWidth")) signpostedAs: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="signpostedAs")) - reversedSignpostedAs: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="reversedSignpostedAs")) + reversedSignpostedAs: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="reversedSignpostedAs")) def __post_init__(self): @@ -110,7 +110,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: 
content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/routenetworks.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/routenetworks.py index 4f793eb..9bf93c7 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/routenetworks.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/routenetworks.py @@ -22,7 +22,7 @@ class RouteNetworks: routeNetworkId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="routeNetworkId")) routeId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="routeId")) - networkId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkId")) + networkId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkId")) def __post_init__(self): @@ -83,7 +83,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/routes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/routes.py index db6abef..04705b1 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/routes.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/routes.py @@ -8,8 +8,8 @@ import dataclasses import dataclasses_json import json -from gtfs_rt_producer_data.generaltransitfeedstatic.routetype import RouteType from gtfs_rt_producer_data.generaltransitfeedstatic.continuousdropoff import ContinuousDropOff +from gtfs_rt_producer_data.generaltransitfeedstatic.routetype import RouteType from gtfs_rt_producer_data.generaltransitfeedstatic.continuouspickup import ContinuousPickup @@ -45,7 +45,7 @@ class Routes: routeSortOrder: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="routeSortOrder")) continuousPickup: ContinuousPickup=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="continuousPickup")) continuousDropOff: ContinuousDropOff=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="continuousDropOff")) - networkId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkId")) + networkId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="networkId")) def __post_init__(self): @@ -116,7 +116,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and 
content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/shapes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/shapes.py index 7bd825c..d31aea3 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/shapes.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/shapes.py @@ -26,7 +26,7 @@ class Shapes: shapePtLat: float=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="shapePtLat")) shapePtLon: float=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="shapePtLon")) shapePtSequence: int=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="shapePtSequence")) - shapeDistTraveled: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="shapeDistTraveled")) + shapeDistTraveled: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="shapeDistTraveled")) def __post_init__(self): @@ -89,7 +89,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stopareas.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stopareas.py index 3f83899..9f504fb 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stopareas.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stopareas.py @@ -22,7 +22,7 @@ class StopAreas: stopAreaId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stopAreaId")) stopId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stopId")) - areaId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="areaId")) + areaId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="areaId")) def __post_init__(self): @@ -83,7 +83,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stops.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stops.py index fb08dc1..e5403a8 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stops.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stops.py @@ -8,8 +8,8 @@ import dataclasses import dataclasses_json import json -from gtfs_rt_producer_data.generaltransitfeedstatic.locationtype import LocationType from 
gtfs_rt_producer_data.generaltransitfeedstatic.wheelchairboarding import WheelchairBoarding +from gtfs_rt_producer_data.generaltransitfeedstatic.locationtype import LocationType @dataclasses_json.dataclass_json @@ -48,7 +48,7 @@ class Stops: stopTimezone: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stopTimezone")) wheelchairBoarding: WheelchairBoarding=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="wheelchairBoarding")) levelId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="levelId")) - platformCode: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="platformCode")) + platformCode: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="platformCode")) def __post_init__(self): @@ -121,7 +121,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stoptimes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stoptimes.py index 283146b..f2a3552 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stoptimes.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/stoptimes.py @@ -8,9 +8,9 @@ import dataclasses import dataclasses_json import json -from gtfs_rt_producer_data.generaltransitfeedstatic.continuousdropoff import ContinuousDropOff from gtfs_rt_producer_data.generaltransitfeedstatic.timepoint import Timepoint from gtfs_rt_producer_data.generaltransitfeedstatic.pickuptype import PickupType +from gtfs_rt_producer_data.generaltransitfeedstatic.continuousdropoff import ContinuousDropOff from gtfs_rt_producer_data.generaltransitfeedstatic.dropofftype import DropOffType from gtfs_rt_producer_data.generaltransitfeedstatic.continuouspickup import ContinuousPickup @@ -45,7 +45,7 @@ class StopTimes: continuousPickup: typing.Optional[ContinuousPickup]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="continuousPickup")) continuousDropOff: typing.Optional[ContinuousDropOff]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="continuousDropOff")) shapeDistTraveled: typing.Optional[float]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="shapeDistTraveled")) - timepoint: Timepoint=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="timepoint")) + timepoint: Timepoint=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="timepoint")) def __post_init__(self): @@ -115,7 +115,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git 
a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/timeframes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/timeframes.py index f3d2edc..c9d18a5 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/timeframes.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/timeframes.py @@ -26,7 +26,7 @@ class Timeframes: timeframeGroupId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="timeframeGroupId")) startTime: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="startTime")) endTime: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="endTime")) - serviceDates: typing.Union[Calendar, CalendarDates]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="serviceDates")) + serviceDates: typing.Union[Calendar, CalendarDates]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="serviceDates")) def __post_init__(self): @@ -88,7 +88,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/transfers.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/transfers.py index 75571a6..cfc7994 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/transfers.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/transfers.py @@ -24,7 +24,7 @@ class Transfers: fromStopId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fromStopId")) toStopId: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="toStopId")) transferType: int=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="transferType")) - minTransferTime: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="minTransferTime")) + minTransferTime: typing.Optional[int]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="minTransferTime")) def __post_init__(self): @@ -86,7 +86,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/translations.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/translations.py index 271ecf3..b310b1d 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/translations.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/translations.py @@ 
-24,7 +24,7 @@ class Translations: tableName: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="tableName")) fieldName: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="fieldName")) language: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="language")) - translation: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="translation")) + translation: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="translation")) def __post_init__(self): @@ -86,7 +86,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/trips.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/trips.py index 5582cb7..3112e92 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/trips.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/src/gtfs_rt_producer_data/generaltransitfeedstatic/trips.py @@ -8,11 +8,11 @@ import dataclasses import dataclasses_json import json -from gtfs_rt_producer_data.generaltransitfeedstatic.calendar import Calendar -from gtfs_rt_producer_data.generaltransitfeedstatic.calendardates import CalendarDates from gtfs_rt_producer_data.generaltransitfeedstatic.wheelchairaccessible import WheelchairAccessible +from gtfs_rt_producer_data.generaltransitfeedstatic.calendar import Calendar from gtfs_rt_producer_data.generaltransitfeedstatic.directionid import DirectionId from gtfs_rt_producer_data.generaltransitfeedstatic.bikesallowed import BikesAllowed +from gtfs_rt_producer_data.generaltransitfeedstatic.calendardates import CalendarDates @dataclasses_json.dataclass_json @@ -43,7 +43,7 @@ class Trips: blockId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="blockId")) shapeId: typing.Optional[str]=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="shapeId")) wheelchairAccessible: WheelchairAccessible=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="wheelchairAccessible")) - bikesAllowed: BikesAllowed=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="bikesAllowed")) + bikesAllowed: BikesAllowed=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="bikesAllowed")) def __post_init__(self): @@ -113,7 +113,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_alert.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_alert.py index 6fb8042..dba7403 100644 --- 
a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_alert.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_alert.py @@ -9,12 +9,13 @@ sys.path.append(os.path.realpath(os.path.join(os.path.dirname(__file__), '../src'.replace('/', os.sep)))) from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.alert import Alert -from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_entityselector import Test_EntitySelector +from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_alert_types_cause import Test_Cause from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_alert_types_effect import Test_Effect from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_timerange import Test_TimeRange -from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_alert_types_cause import Test_Cause +from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_entityselector import Test_EntitySelector from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring import Test_TranslatedString + class Test_Alert(unittest.TestCase): """ Test case for Alert @@ -32,8 +33,8 @@ def create_instance(): Create instance of Alert for testing """ instance = Alert( - active_period=[Test_TimeRange.create_instance(), Test_TimeRange.create_instance(), Test_TimeRange.create_instance(), Test_TimeRange.create_instance(), Test_TimeRange.create_instance()], - informed_entity=[Test_EntitySelector.create_instance()], + active_period=[Test_TimeRange.create_instance(), Test_TimeRange.create_instance()], + informed_entity=[Test_EntitySelector.create_instance(), Test_EntitySelector.create_instance()], cause=Test_Cause.create_instance(), effect=Test_Effect.create_instance(), url=Test_TranslatedString.create_instance(), @@ -47,7 +48,7 @@ def test_active_period_property(self): """ Test active_period property """ - test_value = [Test_TimeRange.create_instance(), Test_TimeRange.create_instance(), Test_TimeRange.create_instance(), Test_TimeRange.create_instance(), Test_TimeRange.create_instance()] + test_value = [Test_TimeRange.create_instance(), Test_TimeRange.create_instance()] self.instance.active_period = test_value self.assertEqual(self.instance.active_period, test_value) @@ -55,7 +56,7 @@ def test_informed_entity_property(self): """ Test informed_entity property """ - test_value = [Test_EntitySelector.create_instance()] + test_value = [Test_EntitySelector.create_instance(), Test_EntitySelector.create_instance()] self.instance.informed_entity = test_value self.assertEqual(self.instance.informed_entity, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_entityselector.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_entityselector.py index a6b7547..b41c26f 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_entityselector.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_entityselector.py @@ -11,6 +11,7 @@ from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.entityselector import EntitySelector from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_tripdescriptor import Test_TripDescriptor + class Test_EntitySelector(unittest.TestCase): """ Test 
case for EntitySelector @@ -28,11 +29,11 @@ def create_instance(): Create instance of EntitySelector for testing """ instance = EntitySelector( - agency_id='lszsmxfbnhhvccyoqfjm', - route_id='tfefgldayootbvenduxj', - route_type=int(30), + agency_id='gxcnksmgblqlynmibyyf', + route_id='pjhgaszrnyggundlzldl', + route_type=int(81), trip=Test_TripDescriptor.create_instance(), - stop_id='kisllbnvcmtutgojincv' + stop_id='ivuadfbbmpwgjzbxoavh' ) return instance @@ -41,7 +42,7 @@ def test_agency_id_property(self): """ Test agency_id property """ - test_value = 'lszsmxfbnhhvccyoqfjm' + test_value = 'gxcnksmgblqlynmibyyf' self.instance.agency_id = test_value self.assertEqual(self.instance.agency_id, test_value) @@ -49,7 +50,7 @@ def test_route_id_property(self): """ Test route_id property """ - test_value = 'tfefgldayootbvenduxj' + test_value = 'pjhgaszrnyggundlzldl' self.instance.route_id = test_value self.assertEqual(self.instance.route_id, test_value) @@ -57,7 +58,7 @@ def test_route_type_property(self): """ Test route_type property """ - test_value = int(30) + test_value = int(81) self.instance.route_type = test_value self.assertEqual(self.instance.route_type, test_value) @@ -73,7 +74,7 @@ def test_stop_id_property(self): """ Test stop_id property """ - test_value = 'kisllbnvcmtutgojincv' + test_value = 'ivuadfbbmpwgjzbxoavh' self.instance.stop_id = test_value self.assertEqual(self.instance.stop_id, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_timerange.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_timerange.py index 01ba3c5..72ecf77 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_timerange.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_timerange.py @@ -10,6 +10,7 @@ from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.timerange import TimeRange + class Test_TimeRange(unittest.TestCase): """ Test case for TimeRange @@ -27,8 +28,8 @@ def create_instance(): Create instance of TimeRange for testing """ instance = TimeRange( - start=int(66), - end=int(89) + start=int(39), + end=int(4) ) return instance @@ -37,7 +38,7 @@ def test_start_property(self): """ Test start property """ - test_value = int(66) + test_value = int(39) self.instance.start = test_value self.assertEqual(self.instance.start, test_value) @@ -45,7 +46,7 @@ def test_end_property(self): """ Test end property """ - test_value = int(89) + test_value = int(4) self.instance.end = test_value self.assertEqual(self.instance.end, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring.py index 5c189e0..bad35bb 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring.py @@ -11,6 +11,7 @@ from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.translatedstring import TranslatedString from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring_types_translation import 
Test_Translation + class Test_TranslatedString(unittest.TestCase): """ Test case for TranslatedString @@ -28,7 +29,7 @@ def create_instance(): Create instance of TranslatedString for testing """ instance = TranslatedString( - translation=[Test_Translation.create_instance()] + translation=[Test_Translation.create_instance(), Test_Translation.create_instance(), Test_Translation.create_instance()] ) return instance @@ -37,7 +38,7 @@ def test_translation_property(self): """ Test translation property """ - test_value = [Test_Translation.create_instance()] + test_value = [Test_Translation.create_instance(), Test_Translation.create_instance(), Test_Translation.create_instance()] self.instance.translation = test_value self.assertEqual(self.instance.translation, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring_types_translation.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring_types_translation.py index 392d754..3eacd65 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring_types_translation.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_translatedstring_types_translation.py @@ -10,6 +10,7 @@ from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.translatedstring_types.translation import Translation + class Test_Translation(unittest.TestCase): """ Test case for Translation @@ -27,8 +28,8 @@ def create_instance(): Create instance of Translation for testing """ instance = Translation( - text='rmeprxytceqskgixpcho', - language='vqlrutmqpsdpoxltxqfp' + text='fyaudzmlmrtsxjpufzel', + language='dtszasjvkliuszcfwkzi' ) return instance @@ -37,7 +38,7 @@ def test_text_property(self): """ Test text property """ - test_value = 'rmeprxytceqskgixpcho' + test_value = 'fyaudzmlmrtsxjpufzel' self.instance.text = test_value self.assertEqual(self.instance.text, test_value) @@ -45,7 +46,7 @@ def test_language_property(self): """ Test language property """ - test_value = 'vqlrutmqpsdpoxltxqfp' + test_value = 'dtszasjvkliuszcfwkzi' self.instance.language = test_value self.assertEqual(self.instance.language, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_tripdescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_tripdescriptor.py index b7668f9..ecd39b9 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_tripdescriptor.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_tripdescriptor.py @@ -11,6 +11,7 @@ from gtfs_rt_producer_data.generaltransitfeedrealtime.alert.tripdescriptor import TripDescriptor from test_gtfs_rt_producer_data_generaltransitfeedrealtime_alert_tripdescriptor_types_schedulerelationship import Test_ScheduleRelationship + class Test_TripDescriptor(unittest.TestCase): """ Test case for TripDescriptor @@ -28,11 +29,11 @@ def create_instance(): Create instance of TripDescriptor for testing """ instance = TripDescriptor( - trip_id='qglsupoexwacuytlexbk', - route_id='jxpiqhkvbyozrnsecluk', - direction_id=int(85), - start_time='kwzqdkixaqlogxptlato', - 
start_date='leqhbdjoqzhbtwnptzxd', + trip_id='dtvpysbcfhhlafjfjrkv', + route_id='ilbzppzemojihpcjzhiw', + direction_id=int(26), + start_time='hehbuquskidywfcuaqys', + start_date='evlumeaomjaqbrgpydjw', schedule_relationship=Test_ScheduleRelationship.create_instance() ) return instance @@ -42,7 +43,7 @@ def test_trip_id_property(self): """ Test trip_id property """ - test_value = 'qglsupoexwacuytlexbk' + test_value = 'dtvpysbcfhhlafjfjrkv' self.instance.trip_id = test_value self.assertEqual(self.instance.trip_id, test_value) @@ -50,7 +51,7 @@ def test_route_id_property(self): """ Test route_id property """ - test_value = 'jxpiqhkvbyozrnsecluk' + test_value = 'ilbzppzemojihpcjzhiw' self.instance.route_id = test_value self.assertEqual(self.instance.route_id, test_value) @@ -58,7 +59,7 @@ def test_direction_id_property(self): """ Test direction_id property """ - test_value = int(85) + test_value = int(26) self.instance.direction_id = test_value self.assertEqual(self.instance.direction_id, test_value) @@ -66,7 +67,7 @@ def test_start_time_property(self): """ Test start_time property """ - test_value = 'kwzqdkixaqlogxptlato' + test_value = 'hehbuquskidywfcuaqys' self.instance.start_time = test_value self.assertEqual(self.instance.start_time, test_value) @@ -74,7 +75,7 @@ def test_start_date_property(self): """ Test start_date property """ - test_value = 'leqhbdjoqzhbtwnptzxd' + test_value = 'evlumeaomjaqbrgpydjw' self.instance.start_date = test_value self.assertEqual(self.instance.start_date, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripdescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripdescriptor.py index 5099fbc..94d7207 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripdescriptor.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripdescriptor.py @@ -11,6 +11,7 @@ from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripdescriptor import TripDescriptor from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripdescriptor_types_schedulerelationship import Test_ScheduleRelationship + class Test_TripDescriptor(unittest.TestCase): """ Test case for TripDescriptor @@ -28,11 +29,11 @@ def create_instance(): Create instance of TripDescriptor for testing """ instance = TripDescriptor( - trip_id='lsgpttqfanckpgtseveb', - route_id='thxjemzilbovxjncgzio', - direction_id=int(26), - start_time='ogxjzkildrujoydterqd', - start_date='cfqeiseacbwrszivpyeb', + trip_id='exlfkflxzdftsmiuuslf', + route_id='pnqrrjzenysnxdcbuhhv', + direction_id=int(35), + start_time='hlqtxokxhzjmaiyxcsix', + start_date='ittpkxlmbsknyaoroila', schedule_relationship=Test_ScheduleRelationship.create_instance() ) return instance @@ -42,7 +43,7 @@ def test_trip_id_property(self): """ Test trip_id property """ - test_value = 'lsgpttqfanckpgtseveb' + test_value = 'exlfkflxzdftsmiuuslf' self.instance.trip_id = test_value self.assertEqual(self.instance.trip_id, test_value) @@ -50,7 +51,7 @@ def test_route_id_property(self): """ Test route_id property """ - test_value = 'thxjemzilbovxjncgzio' + test_value = 'pnqrrjzenysnxdcbuhhv' self.instance.route_id = test_value self.assertEqual(self.instance.route_id, test_value) @@ -58,7 +59,7 @@ def test_direction_id_property(self): """ Test direction_id property """ - 
test_value = int(26) + test_value = int(35) self.instance.direction_id = test_value self.assertEqual(self.instance.direction_id, test_value) @@ -66,7 +67,7 @@ def test_start_time_property(self): """ Test start_time property """ - test_value = 'ogxjzkildrujoydterqd' + test_value = 'hlqtxokxhzjmaiyxcsix' self.instance.start_time = test_value self.assertEqual(self.instance.start_time, test_value) @@ -74,7 +75,7 @@ def test_start_date_property(self): """ Test start_date property """ - test_value = 'cfqeiseacbwrszivpyeb' + test_value = 'ittpkxlmbsknyaoroila' self.instance.start_date = test_value self.assertEqual(self.instance.start_date, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate.py index f28bef7..1fb2c1d 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate.py @@ -9,9 +9,10 @@ sys.path.append(os.path.realpath(os.path.join(os.path.dirname(__file__), '../src'.replace('/', os.sep)))) from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripupdate import TripUpdate -from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_vehicledescriptor import Test_VehicleDescriptor -from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeupdate import Test_StopTimeUpdate from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripdescriptor import Test_TripDescriptor +from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeupdate import Test_StopTimeUpdate +from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_vehicledescriptor import Test_VehicleDescriptor + class Test_TripUpdate(unittest.TestCase): """ @@ -33,8 +34,8 @@ def create_instance(): trip=Test_TripDescriptor.create_instance(), vehicle=Test_VehicleDescriptor.create_instance(), stop_time_update=[Test_StopTimeUpdate.create_instance(), Test_StopTimeUpdate.create_instance(), Test_StopTimeUpdate.create_instance(), Test_StopTimeUpdate.create_instance(), Test_StopTimeUpdate.create_instance()], - timestamp=int(81), - delay=int(7) + timestamp=int(56), + delay=int(69) ) return instance @@ -67,7 +68,7 @@ def test_timestamp_property(self): """ Test timestamp property """ - test_value = int(81) + test_value = int(56) self.instance.timestamp = test_value self.assertEqual(self.instance.timestamp, test_value) @@ -75,7 +76,7 @@ def test_delay_property(self): """ Test delay property """ - test_value = int(7) + test_value = int(69) self.instance.delay = test_value self.assertEqual(self.instance.delay, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeevent.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeevent.py index e11ecd8..866560a 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeevent.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeevent.py @@ -10,6 +10,7 @@ 
 
 from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripupdate_types.stoptimeevent import StopTimeEvent
 
+
 class Test_StopTimeEvent(unittest.TestCase):
     """
     Test case for StopTimeEvent
@@ -27,9 +28,9 @@ def create_instance():
         Create instance of StopTimeEvent for testing
         """
         instance = StopTimeEvent(
-            delay=int(77),
-            time=int(73),
-            uncertainty=int(91)
+            delay=int(50),
+            time=int(44),
+            uncertainty=int(4)
         )
         return instance
 
@@ -38,7 +39,7 @@ def test_delay_property(self):
         """
         Test delay property
         """
-        test_value = int(77)
+        test_value = int(50)
         self.instance.delay = test_value
         self.assertEqual(self.instance.delay, test_value)
 
@@ -46,7 +47,7 @@ def test_time_property(self):
         """
         Test time property
         """
-        test_value = int(73)
+        test_value = int(44)
         self.instance.time = test_value
         self.assertEqual(self.instance.time, test_value)
 
@@ -54,7 +55,7 @@ def test_uncertainty_property(self):
         """
         Test uncertainty property
         """
-        test_value = int(91)
+        test_value = int(4)
         self.instance.uncertainty = test_value
         self.assertEqual(self.instance.uncertainty, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeupdate.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeupdate.py
index 76ed8b5..ad4a5da 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeupdate.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeupdate.py
@@ -9,8 +9,9 @@
 sys.path.append(os.path.realpath(os.path.join(os.path.dirname(__file__), '../src'.replace('/', os.sep))))
 
 from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.tripupdate_types.stoptimeupdate import StopTimeUpdate
-from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeevent import Test_StopTimeEvent
 from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeupdate_types_schedulerelationship import Test_ScheduleRelationship
+from test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_tripupdate_types_stoptimeevent import Test_StopTimeEvent
+
 
 class Test_StopTimeUpdate(unittest.TestCase):
     """
     Test case for StopTimeUpdate
@@ -29,8 +30,8 @@ def create_instance():
         Create instance of StopTimeUpdate for testing
         """
         instance = StopTimeUpdate(
-            stop_sequence=int(77),
-            stop_id='mhadwyaehvdwpunlecmj',
+            stop_sequence=int(99),
+            stop_id='vzfroynyusdrmcgrqslj',
             arrival=Test_StopTimeEvent.create_instance(),
             departure=Test_StopTimeEvent.create_instance(),
             schedule_relationship=Test_ScheduleRelationship.create_instance()
         )
         return instance
 
@@ -42,7 +43,7 @@ def test_stop_sequence_property(self):
         """
         Test stop_sequence property
         """
-        test_value = int(77)
+        test_value = int(99)
         self.instance.stop_sequence = test_value
         self.assertEqual(self.instance.stop_sequence, test_value)
 
@@ -50,7 +51,7 @@ def test_stop_id_property(self):
         """
         Test stop_id property
         """
-        test_value = 'mhadwyaehvdwpunlecmj'
+        test_value = 'vzfroynyusdrmcgrqslj'
         self.instance.stop_id = test_value
         self.assertEqual(self.instance.stop_id, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_vehicledescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_vehicledescriptor.py
index 330032e..439cc3d 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_vehicledescriptor.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_trip_vehicledescriptor.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedrealtime.trip.vehicledescriptor import VehicleDescriptor
 
+
 class Test_VehicleDescriptor(unittest.TestCase):
     """
     Test case for VehicleDescriptor
@@ -27,9 +28,9 @@ def create_instance():
         Create instance of VehicleDescriptor for testing
         """
         instance = VehicleDescriptor(
-            id='mhpdutwcrvmrjqkfzutr',
-            label='ziryuyksmevknkxgrswq',
-            license_plate='rpzkenkinlknbrojxtex'
+            id='loaflhgdmostltmohnqx',
+            label='egblyxgvvtdgxvennuej',
+            license_plate='vpbbrramckfrachdjcth'
         )
         return instance
 
@@ -38,7 +39,7 @@ def test_id_property(self):
         """
         Test id property
         """
-        test_value = 'mhpdutwcrvmrjqkfzutr'
+        test_value = 'loaflhgdmostltmohnqx'
         self.instance.id = test_value
         self.assertEqual(self.instance.id, test_value)
 
@@ -46,7 +47,7 @@ def test_label_property(self):
         """
         Test label property
         """
-        test_value = 'ziryuyksmevknkxgrswq'
+        test_value = 'egblyxgvvtdgxvennuej'
         self.instance.label = test_value
         self.assertEqual(self.instance.label, test_value)
 
@@ -54,7 +55,7 @@ def test_license_plate_property(self):
         """
         Test license_plate property
         """
-        test_value = 'rpzkenkinlknbrojxtex'
+        test_value = 'vpbbrramckfrachdjcth'
         self.instance.license_plate = test_value
         self.assertEqual(self.instance.license_plate, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_position.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_position.py
index 6a43e39..bd0131a 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_position.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_position.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.position import Position
 
+
 class Test_Position(unittest.TestCase):
     """
     Test case for Position
@@ -27,11 +28,11 @@ def create_instance():
         Create instance of Position for testing
         """
         instance = Position(
-            latitude=float(74.81703275295268),
-            longitude=float(54.87945296095138),
-            bearing=float(8.010107073749584),
-            odometer=float(54.652392261686266),
-            speed=float(82.25027282559265)
+            latitude=float(59.0878001952112),
+            longitude=float(89.51989918184016),
+            bearing=float(73.14224043422124),
+            odometer=float(74.87562365214251),
+            speed=float(29.87618498002088)
         )
         return instance
 
@@ -40,7 +41,7 @@ def test_latitude_property(self):
         """
         Test latitude property
         """
-        test_value = float(74.81703275295268)
+        test_value = float(59.0878001952112)
         self.instance.latitude = test_value
         self.assertEqual(self.instance.latitude, test_value)
 
@@ -48,7 +49,7 @@ def test_longitude_property(self):
         """
         Test longitude property
         """
-        test_value = float(54.87945296095138)
+        test_value = float(89.51989918184016)
         self.instance.longitude = test_value
         self.assertEqual(self.instance.longitude, test_value)
 
@@ -56,7 +57,7 @@ def test_bearing_property(self):
         """
         Test bearing property
         """
-        test_value = float(8.010107073749584)
+        test_value = float(73.14224043422124)
         self.instance.bearing = test_value
         self.assertEqual(self.instance.bearing, test_value)
 
@@ -64,7 +65,7 @@ def test_odometer_property(self):
         """
         Test odometer property
         """
-        test_value = float(54.652392261686266)
+        test_value = float(74.87562365214251)
         self.instance.odometer = test_value
         self.assertEqual(self.instance.odometer, test_value)
 
@@ -72,7 +73,7 @@ def test_speed_property(self):
         """
         Test speed property
         """
-        test_value = float(82.25027282559265)
+        test_value = float(29.87618498002088)
         self.instance.speed = test_value
         self.assertEqual(self.instance.speed, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_tripdescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_tripdescriptor.py
index 5a5558d..983c5b8 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_tripdescriptor.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_tripdescriptor.py
@@ -11,6 +11,7 @@
 from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.tripdescriptor import TripDescriptor
 from test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_tripdescriptor_types_schedulerelationship import Test_ScheduleRelationship
 
+
 class Test_TripDescriptor(unittest.TestCase):
     """
     Test case for TripDescriptor
@@ -28,11 +29,11 @@ def create_instance():
         Create instance of TripDescriptor for testing
         """
         instance = TripDescriptor(
-            trip_id='cenkcouwatkqismwclkg',
-            route_id='xcbubnxdvyntxzhyqhlb',
-            direction_id=int(77),
-            start_time='zbbranwqfukrgdvtvyou',
-            start_date='prrfvjylxwrjtlfdpfwt',
+            trip_id='yyedpaxxdxbgsqkpvxcq',
+            route_id='rdlfvzpmrfspcdqtmmry',
+            direction_id=int(47),
+            start_time='pvsoxocwovdmllpbjmwd',
+            start_date='bdyjnyxldojzjamfueqn',
             schedule_relationship=Test_ScheduleRelationship.create_instance()
         )
         return instance
 
@@ -42,7 +43,7 @@ def test_trip_id_property(self):
         """
         Test trip_id property
         """
-        test_value = 'cenkcouwatkqismwclkg'
+        test_value = 'yyedpaxxdxbgsqkpvxcq'
         self.instance.trip_id = test_value
         self.assertEqual(self.instance.trip_id, test_value)
 
@@ -50,7 +51,7 @@ def test_route_id_property(self):
         """
         Test route_id property
         """
-        test_value = 'xcbubnxdvyntxzhyqhlb'
+        test_value = 'rdlfvzpmrfspcdqtmmry'
         self.instance.route_id = test_value
         self.assertEqual(self.instance.route_id, test_value)
 
@@ -58,7 +59,7 @@ def test_direction_id_property(self):
         """
         Test direction_id property
         """
-        test_value = int(77)
+        test_value = int(47)
         self.instance.direction_id = test_value
         self.assertEqual(self.instance.direction_id, test_value)
 
@@ -66,7 +67,7 @@ def test_start_time_property(self):
         """
         Test start_time property
         """
-        test_value = 'zbbranwqfukrgdvtvyou'
+        test_value = 'pvsoxocwovdmllpbjmwd'
         self.instance.start_time = test_value
         self.assertEqual(self.instance.start_time, test_value)
 
@@ -74,7 +75,7 @@ def test_start_date_property(self):
         """
         Test start_date property
         """
-        test_value = 'prrfvjylxwrjtlfdpfwt'
+        test_value = 'bdyjnyxldojzjamfueqn'
         self.instance.start_date = test_value
         self.assertEqual(self.instance.start_date, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicledescriptor.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicledescriptor.py
index b9d1885..4ef5a3b 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicledescriptor.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicledescriptor.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.vehicledescriptor import VehicleDescriptor
 
+
 class Test_VehicleDescriptor(unittest.TestCase):
     """
     Test case for VehicleDescriptor
@@ -27,9 +28,9 @@ def create_instance():
         Create instance of VehicleDescriptor for testing
         """
         instance = VehicleDescriptor(
-            id='iepxwyvqnkyagvgagrhl',
-            label='mlfgvmpwzwmrmkrmevar',
-            license_plate='rxuhtlotyhzueyknaozu'
+            id='okddyrunbnxkqcjujinc',
+            label='txtxsibzfhyymiidejdh',
+            license_plate='gqnzjpqppqmexsslbydp'
        )
         return instance
 
@@ -38,7 +39,7 @@ def test_id_property(self):
         """
         Test id property
         """
-        test_value = 'iepxwyvqnkyagvgagrhl'
+        test_value = 'okddyrunbnxkqcjujinc'
         self.instance.id = test_value
         self.assertEqual(self.instance.id, test_value)
 
@@ -46,7 +47,7 @@ def test_label_property(self):
         """
         Test label property
         """
-        test_value = 'mlfgvmpwzwmrmkrmevar'
+        test_value = 'txtxsibzfhyymiidejdh'
         self.instance.label = test_value
         self.assertEqual(self.instance.label, test_value)
 
@@ -54,7 +55,7 @@ def test_license_plate_property(self):
         """
         Test license_plate property
         """
-        test_value = 'rxuhtlotyhzueyknaozu'
+        test_value = 'gqnzjpqppqmexsslbydp'
         self.instance.license_plate = test_value
         self.assertEqual(self.instance.license_plate, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicleposition.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicleposition.py
index c2df03d..5dc27b9 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicleposition.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicleposition.py
@@ -10,12 +10,13 @@
 
 from gtfs_rt_producer_data.generaltransitfeedrealtime.vehicle.vehicleposition import VehiclePosition
 from test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicleposition_types_congestionlevel import Test_CongestionLevel
+from test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicledescriptor import Test_VehicleDescriptor
 from test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_tripdescriptor import Test_TripDescriptor
 from test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicleposition_types_occupancystatus import Test_OccupancyStatus
 from test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicleposition_types_vehiclestopstatus import Test_VehicleStopStatus
-from test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_vehicledescriptor import Test_VehicleDescriptor
 from test_gtfs_rt_producer_data_generaltransitfeedrealtime_vehicle_position import Test_Position
 
+
 class Test_VehiclePosition(unittest.TestCase):
     """
     Test case for VehiclePosition
@@ -36,10 +37,10 @@ def create_instance():
             trip=Test_TripDescriptor.create_instance(),
             vehicle=Test_VehicleDescriptor.create_instance(),
             position=Test_Position.create_instance(),
-            current_stop_sequence=int(90),
-            stop_id='ghejprvirjtuascktchr',
+            current_stop_sequence=int(30),
+            stop_id='fjtujquvgklgvrbqcbvs',
             current_status=Test_VehicleStopStatus.create_instance(),
-            timestamp=int(71),
+            timestamp=int(11),
             congestion_level=Test_CongestionLevel.create_instance(),
             occupancy_status=Test_OccupancyStatus.create_instance()
         )
@@ -74,7 +75,7 @@ def test_current_stop_sequence_property(self):
         """
         Test current_stop_sequence property
         """
-        test_value = int(90)
+        test_value = int(30)
         self.instance.current_stop_sequence = test_value
         self.assertEqual(self.instance.current_stop_sequence, test_value)
 
@@ -82,7 +83,7 @@ def test_stop_id_property(self):
         """
         Test stop_id property
         """
-        test_value = 'ghejprvirjtuascktchr'
+        test_value = 'fjtujquvgklgvrbqcbvs'
         self.instance.stop_id = test_value
         self.assertEqual(self.instance.stop_id, test_value)
 
@@ -98,7 +99,7 @@ def test_timestamp_property(self):
         """
         Test timestamp property
         """
-        test_value = int(71)
+        test_value = int(11)
         self.instance.timestamp = test_value
         self.assertEqual(self.instance.timestamp, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_agency.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_agency.py
index ec38db9..9c19d83 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_agency.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_agency.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.agency import Agency
 
+
 class Test_Agency(unittest.TestCase):
     """
     Test case for Agency
@@ -27,14 +28,14 @@ def create_instance():
         Create instance of Agency for testing
         """
         instance = Agency(
-            agencyId='exlkzimboltqsqydmlvy',
-            agencyName='rvffzmgzqxodrdtzhpsl',
-            agencyUrl='cpjnlztgmxqrybozbxqn',
-            agencyTimezone='mxeqmtkttoqrpaisxtiu',
-            agencyLang='msyhoinufaehtomkvfqt',
-            agencyPhone='plmvifljzvphtcftmrvu',
-            agencyFareUrl='atnqnmbycyifvkzqlivd',
-            agencyEmail='iydsmvsjtzggjqrifrvc'
+            agencyId='wtyymahbtvjhtywoiqbw',
+            agencyName='wlywtzyytvvfvjiafusy',
+            agencyUrl='ngfeeryjjupnkdhnjnyw',
+            agencyTimezone='rdyfaiodkqhpqrbolkhe',
+            agencyLang='jgwuxbnecqdujuvhphph',
+            agencyPhone='baknryrubytimhqnnmyl',
+            agencyFareUrl='embelkfvpuzwuvhkpbxv',
+            agencyEmail='ajxslnhqhrqqknhccbcq'
         )
         return instance
 
@@ -43,7 +44,7 @@ def test_agencyId_property(self):
         """
         Test agencyId property
         """
-        test_value = 'exlkzimboltqsqydmlvy'
+        test_value = 'wtyymahbtvjhtywoiqbw'
         self.instance.agencyId = test_value
         self.assertEqual(self.instance.agencyId, test_value)
 
@@ -51,7 +52,7 @@ def test_agencyName_property(self):
         """
         Test agencyName property
         """
-        test_value = 'rvffzmgzqxodrdtzhpsl'
+        test_value = 'wlywtzyytvvfvjiafusy'
         self.instance.agencyName = test_value
         self.assertEqual(self.instance.agencyName, test_value)
 
@@ -59,7 +60,7 @@ def test_agencyUrl_property(self):
         """
         Test agencyUrl property
         """
-        test_value = 'cpjnlztgmxqrybozbxqn'
+        test_value = 'ngfeeryjjupnkdhnjnyw'
         self.instance.agencyUrl = test_value
         self.assertEqual(self.instance.agencyUrl, test_value)
 
@@ -67,7 +68,7 @@ def test_agencyTimezone_property(self):
         """
         Test agencyTimezone property
         """
-        test_value = 'mxeqmtkttoqrpaisxtiu'
+        test_value = 'rdyfaiodkqhpqrbolkhe'
         self.instance.agencyTimezone = test_value
         self.assertEqual(self.instance.agencyTimezone, test_value)
 
@@ -75,7 +76,7 @@ def test_agencyLang_property(self):
         """
         Test agencyLang property
         """
-        test_value = 'msyhoinufaehtomkvfqt'
+        test_value = 'jgwuxbnecqdujuvhphph'
         self.instance.agencyLang = test_value
         self.assertEqual(self.instance.agencyLang, test_value)
 
@@ -83,7 +84,7 @@ def test_agencyPhone_property(self):
         """
         Test agencyPhone property
         """
-        test_value = 'plmvifljzvphtcftmrvu'
+        test_value = 'baknryrubytimhqnnmyl'
         self.instance.agencyPhone = test_value
         self.assertEqual(self.instance.agencyPhone, test_value)
 
@@ -91,7 +92,7 @@ def test_agencyFareUrl_property(self):
         """
         Test agencyFareUrl property
         """
-        test_value = 'atnqnmbycyifvkzqlivd'
+        test_value = 'embelkfvpuzwuvhkpbxv'
         self.instance.agencyFareUrl = test_value
         self.assertEqual(self.instance.agencyFareUrl, test_value)
 
@@ -99,7 +100,7 @@ def test_agencyEmail_property(self):
         """
         Test agencyEmail property
         """
-        test_value = 'iydsmvsjtzggjqrifrvc'
+        test_value = 'ajxslnhqhrqqknhccbcq'
         self.instance.agencyEmail = test_value
         self.assertEqual(self.instance.agencyEmail, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_areas.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_areas.py
index b87cde8..1e15d44 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_areas.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_areas.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.areas import Areas
 
+
 class Test_Areas(unittest.TestCase):
     """
     Test case for Areas
@@ -27,10 +28,10 @@ def create_instance():
         Create instance of Areas for testing
         """
         instance = Areas(
-            areaId='vyjdvufplupeyguygkqr',
-            areaName='fqcppadxpulxkukvwark',
-            areaDesc='tdncnceliydnpajvhffo',
-            areaUrl='nsulfjbpbdahhneiucqh'
+            areaId='nvurooowwymloithehrg',
+            areaName='kfnkbhcowrazdtpkgldo',
+            areaDesc='jolqlthepdnfuidtfaee',
+            areaUrl='eatgptubkpcgscefipaf'
         )
         return instance
 
@@ -39,7 +40,7 @@ def test_areaId_property(self):
         """
         Test areaId property
         """
-        test_value = 'vyjdvufplupeyguygkqr'
+        test_value = 'nvurooowwymloithehrg'
         self.instance.areaId = test_value
         self.assertEqual(self.instance.areaId, test_value)
 
@@ -47,7 +48,7 @@ def test_areaName_property(self):
         """
         Test areaName property
         """
-        test_value = 'fqcppadxpulxkukvwark'
+        test_value = 'kfnkbhcowrazdtpkgldo'
         self.instance.areaName = test_value
         self.assertEqual(self.instance.areaName, test_value)
 
@@ -55,7 +56,7 @@ def test_areaDesc_property(self):
         """
         Test areaDesc property
         """
-        test_value = 'tdncnceliydnpajvhffo'
+        test_value = 'jolqlthepdnfuidtfaee'
         self.instance.areaDesc = test_value
         self.assertEqual(self.instance.areaDesc, test_value)
 
@@ -63,7 +64,7 @@ def test_areaUrl_property(self):
         """
         Test areaUrl property
         """
-        test_value = 'nsulfjbpbdahhneiucqh'
+        test_value = 'eatgptubkpcgscefipaf'
         self.instance.areaUrl = test_value
         self.assertEqual(self.instance.areaUrl, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_attributions.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_attributions.py
index ec6e73b..278bb13 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_attributions.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_attributions.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.attributions import Attributions
 
+
 class Test_Attributions(unittest.TestCase):
     """
     Test case for Attributions
@@ -27,17 +28,17 @@ def create_instance():
         Create instance of Attributions for testing
         """
         instance = Attributions(
-            attributionId='adhcpfqtmffgnegtkrfq',
-            agencyId='tixvweuaacblcydkcpss',
-            routeId='qnnyynrpkuhvcoqmqqmp',
-            tripId='pgmecixzbjsncnvanqwf',
-            organizationName='vafalxnzkzlwavmalngt',
-            isProducer=int(32),
-            isOperator=int(40),
-            isAuthority=int(26),
-            attributionUrl='twushktonkmvlorzsljm',
-            attributionEmail='myjxwsrbieqgxhqpbgee',
-            attributionPhone='htejliwekbcximstadon'
+            attributionId='bwptbceykmeixyluciui',
+            agencyId='obqqopoqsmutelnzkvgm',
+            routeId='olqziqolzsvgurmawyed',
+            tripId='fdxieozjszvvzwicznrt',
+            organizationName='ddvphbfacmlprxzhjyto',
+            isProducer=int(53),
+            isOperator=int(88),
+            isAuthority=int(81),
+            attributionUrl='vrqssplljvouxhqvhyni',
+            attributionEmail='tfninppmkhstvmumctxo',
+            attributionPhone='sxohfhcgsnrjjczyhjrx'
         )
         return instance
 
@@ -46,7 +47,7 @@ def test_attributionId_property(self):
         """
         Test attributionId property
         """
-        test_value = 'adhcpfqtmffgnegtkrfq'
+        test_value = 'bwptbceykmeixyluciui'
         self.instance.attributionId = test_value
         self.assertEqual(self.instance.attributionId, test_value)
 
@@ -54,7 +55,7 @@ def test_agencyId_property(self):
         """
         Test agencyId property
         """
-        test_value = 'tixvweuaacblcydkcpss'
+        test_value = 'obqqopoqsmutelnzkvgm'
         self.instance.agencyId = test_value
         self.assertEqual(self.instance.agencyId, test_value)
 
@@ -62,7 +63,7 @@ def test_routeId_property(self):
         """
         Test routeId property
         """
-        test_value = 'qnnyynrpkuhvcoqmqqmp'
+        test_value = 'olqziqolzsvgurmawyed'
         self.instance.routeId = test_value
         self.assertEqual(self.instance.routeId, test_value)
 
@@ -70,7 +71,7 @@ def test_tripId_property(self):
         """
         Test tripId property
         """
-        test_value = 'pgmecixzbjsncnvanqwf'
+        test_value = 'fdxieozjszvvzwicznrt'
         self.instance.tripId = test_value
         self.assertEqual(self.instance.tripId, test_value)
 
@@ -78,7 +79,7 @@ def test_organizationName_property(self):
         """
         Test organizationName property
         """
-        test_value = 'vafalxnzkzlwavmalngt'
+        test_value = 'ddvphbfacmlprxzhjyto'
         self.instance.organizationName = test_value
         self.assertEqual(self.instance.organizationName, test_value)
 
@@ -86,7 +87,7 @@ def test_isProducer_property(self):
         """
         Test isProducer property
         """
-        test_value = int(32)
+        test_value = int(53)
         self.instance.isProducer = test_value
         self.assertEqual(self.instance.isProducer, test_value)
 
@@ -94,7 +95,7 @@ def test_isOperator_property(self):
         """
         Test isOperator property
         """
-        test_value = int(40)
+        test_value = int(88)
         self.instance.isOperator = test_value
         self.assertEqual(self.instance.isOperator, test_value)
 
@@ -102,7 +103,7 @@ def test_isAuthority_property(self):
         """
         Test isAuthority property
         """
-        test_value = int(26)
+        test_value = int(81)
         self.instance.isAuthority = test_value
         self.assertEqual(self.instance.isAuthority, test_value)
 
@@ -110,7 +111,7 @@ def test_attributionUrl_property(self):
         """
         Test attributionUrl property
         """
-        test_value = 'twushktonkmvlorzsljm'
+        test_value = 'vrqssplljvouxhqvhyni'
         self.instance.attributionUrl = test_value
         self.assertEqual(self.instance.attributionUrl, test_value)
 
@@ -118,7 +119,7 @@ def test_attributionEmail_property(self):
         """
         Test attributionEmail property
         """
-        test_value = 'myjxwsrbieqgxhqpbgee'
+        test_value = 'tfninppmkhstvmumctxo'
         self.instance.attributionEmail = test_value
         self.assertEqual(self.instance.attributionEmail, test_value)
 
@@ -126,7 +127,7 @@ def test_attributionPhone_property(self):
         """
         Test attributionPhone property
         """
-        test_value = 'htejliwekbcximstadon'
+        test_value = 'sxohfhcgsnrjjczyhjrx'
         self.instance.attributionPhone = test_value
         self.assertEqual(self.instance.attributionPhone, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_bookingrules.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_bookingrules.py
index d602007..89ea23c 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_bookingrules.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_bookingrules.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.bookingrules import BookingRules
 
+
 class Test_BookingRules(unittest.TestCase):
     """
     Test case for BookingRules
@@ -27,10 +28,10 @@ def create_instance():
         Create instance of BookingRules for testing
         """
         instance = BookingRules(
-            bookingRuleId='voaymqakzvuiyeydoneb',
-            bookingRuleName='ytmnvmzmruyvzlsyezia',
-            bookingRuleDesc='fhvqosoxywtxphunncys',
-            bookingRuleUrl='swxcvgjdsdxkrdumktbj'
+            bookingRuleId='cweemwipqlfcdndyrhtm',
+            bookingRuleName='lgstdiptegarzvlaibtr',
+            bookingRuleDesc='ndnhusmbnhsfwyjtlutr',
+            bookingRuleUrl='noivxplxphhffkswhrsu'
         )
         return instance
 
@@ -39,7 +40,7 @@ def test_bookingRuleId_property(self):
         """
         Test bookingRuleId property
         """
-        test_value = 'voaymqakzvuiyeydoneb'
+        test_value = 'cweemwipqlfcdndyrhtm'
         self.instance.bookingRuleId = test_value
         self.assertEqual(self.instance.bookingRuleId, test_value)
 
@@ -47,7 +48,7 @@ def test_bookingRuleName_property(self):
         """
         Test bookingRuleName property
         """
-        test_value = 'ytmnvmzmruyvzlsyezia'
+        test_value = 'lgstdiptegarzvlaibtr'
         self.instance.bookingRuleName = test_value
         self.assertEqual(self.instance.bookingRuleName, test_value)
 
@@ -55,7 +56,7 @@ def test_bookingRuleDesc_property(self):
         """
         Test bookingRuleDesc property
         """
-        test_value = 'fhvqosoxywtxphunncys'
+        test_value = 'ndnhusmbnhsfwyjtlutr'
         self.instance.bookingRuleDesc = test_value
         self.assertEqual(self.instance.bookingRuleDesc, test_value)
 
@@ -63,7 +64,7 @@ def test_bookingRuleUrl_property(self):
         """
         Test bookingRuleUrl property
         """
-        test_value = 'swxcvgjdsdxkrdumktbj'
+        test_value = 'noivxplxphhffkswhrsu'
         self.instance.bookingRuleUrl = test_value
         self.assertEqual(self.instance.bookingRuleUrl, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_calendar.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_calendar.py
index f1c3958..e15d533 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_calendar.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_calendar.py
@@ -11,6 +11,7 @@
 from gtfs_rt_producer_data.generaltransitfeedstatic.calendar import Calendar
 from test_gtfs_rt_producer_data_generaltransitfeedstatic_serviceavailability import Test_ServiceAvailability
 
+
 class Test_Calendar(unittest.TestCase):
     """
     Test case for Calendar
@@ -28,7 +29,7 @@ def create_instance():
         Create instance of Calendar for testing
         """
         instance = Calendar(
-            serviceId='czyphiynzkpwlrdsgldm',
+            serviceId='rqflmkgyhljkdwmhwdsh',
             monday=Test_ServiceAvailability.create_instance(),
             tuesday=Test_ServiceAvailability.create_instance(),
             wednesday=Test_ServiceAvailability.create_instance(),
@@ -36,8 +37,8 @@ def create_instance():
             friday=Test_ServiceAvailability.create_instance(),
             saturday=Test_ServiceAvailability.create_instance(),
             sunday=Test_ServiceAvailability.create_instance(),
-            startDate='pwtkdkuiofwefjjejwcc',
-            endDate='wzuxxvltzzmtbfjqmeaf'
+            startDate='xxrjdcicjstmanvblymi',
+            endDate='fnyyuphjxgnburumytxm'
         )
         return instance
 
@@ -46,7 +47,7 @@ def test_serviceId_property(self):
         """
         Test serviceId property
         """
-        test_value = 'czyphiynzkpwlrdsgldm'
+        test_value = 'rqflmkgyhljkdwmhwdsh'
         self.instance.serviceId = test_value
         self.assertEqual(self.instance.serviceId, test_value)
 
@@ -110,7 +111,7 @@ def test_startDate_property(self):
         """
         Test startDate property
         """
-        test_value = 'pwtkdkuiofwefjjejwcc'
+        test_value = 'xxrjdcicjstmanvblymi'
         self.instance.startDate = test_value
         self.assertEqual(self.instance.startDate, test_value)
 
@@ -118,7 +119,7 @@ def test_endDate_property(self):
         """
         Test endDate property
         """
-        test_value = 'wzuxxvltzzmtbfjqmeaf'
+        test_value = 'fnyyuphjxgnburumytxm'
         self.instance.endDate = test_value
         self.assertEqual(self.instance.endDate, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_calendardates.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_calendardates.py
index f2b7d3c..417ace9 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_calendardates.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_calendardates.py
@@ -11,6 +11,7 @@
 from gtfs_rt_producer_data.generaltransitfeedstatic.calendardates import CalendarDates
 from test_gtfs_rt_producer_data_generaltransitfeedstatic_exceptiontype import Test_ExceptionType
 
+
 class Test_CalendarDates(unittest.TestCase):
     """
     Test case for CalendarDates
@@ -28,8 +29,8 @@ def create_instance():
         Create instance of CalendarDates for testing
         """
         instance = CalendarDates(
-            serviceId='yvyupissrvwcfabsidcw',
-            date='esdqlpjgsbnblqmerxuj',
+            serviceId='bksyabmbnaiyfrzvuftq',
+            date='hbxpizhzrauiiijoyiww',
             exceptionType=Test_ExceptionType.create_instance()
         )
         return instance
 
@@ -39,7 +40,7 @@ def test_serviceId_property(self):
         """
         Test serviceId property
         """
-        test_value = 'yvyupissrvwcfabsidcw'
+        test_value = 'bksyabmbnaiyfrzvuftq'
         self.instance.serviceId = test_value
         self.assertEqual(self.instance.serviceId, test_value)
 
@@ -47,7 +48,7 @@ def test_date_property(self):
         """
         Test date property
         """
-        test_value = 'esdqlpjgsbnblqmerxuj'
+        test_value = 'hbxpizhzrauiiijoyiww'
         self.instance.date = test_value
         self.assertEqual(self.instance.date, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_fareattributes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_fareattributes.py
index 01abbb4..1e64877 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_fareattributes.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_fareattributes.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.fareattributes import FareAttributes
 
+
 class Test_FareAttributes(unittest.TestCase):
     """
     Test case for FareAttributes
@@ -27,13 +28,13 @@ def create_instance():
         Create instance of FareAttributes for testing
         """
         instance = FareAttributes(
-            fareId='xgnhiscetdjkzvyonscn',
-            price=float(23.683224915538638),
-            currencyType='jmogzurmosoorgeuqlbk',
-            paymentMethod=int(35),
-            transfers=int(29),
-            agencyId='nfyiccnhvrzwqpvvszgz',
-            transferDuration=int(26)
+            fareId='agbmnpiaoysimmubyuqu',
+            price=float(50.39431017920468),
+            currencyType='utvflopaurtoicqdcebk',
+            paymentMethod=int(34),
+            transfers=int(63),
+            agencyId='mewwocpcktemydtubprd',
+            transferDuration=int(46)
         )
         return instance
 
@@ -42,7 +43,7 @@ def test_fareId_property(self):
         """
         Test fareId property
         """
-        test_value = 'xgnhiscetdjkzvyonscn'
+        test_value = 'agbmnpiaoysimmubyuqu'
         self.instance.fareId = test_value
         self.assertEqual(self.instance.fareId, test_value)
 
@@ -50,7 +51,7 @@ def test_price_property(self):
         """
         Test price property
         """
-        test_value = float(23.683224915538638)
+        test_value = float(50.39431017920468)
         self.instance.price = test_value
         self.assertEqual(self.instance.price, test_value)
 
@@ -58,7 +59,7 @@ def test_currencyType_property(self):
         """
         Test currencyType property
         """
-        test_value = 'jmogzurmosoorgeuqlbk'
+        test_value = 'utvflopaurtoicqdcebk'
         self.instance.currencyType = test_value
         self.assertEqual(self.instance.currencyType, test_value)
 
@@ -66,7 +67,7 @@ def test_paymentMethod_property(self):
         """
         Test paymentMethod property
         """
-        test_value = int(35)
+        test_value = int(34)
         self.instance.paymentMethod = test_value
         self.assertEqual(self.instance.paymentMethod, test_value)
 
@@ -74,7 +75,7 @@ def test_transfers_property(self):
         """
         Test transfers property
         """
-        test_value = int(29)
+        test_value = int(63)
         self.instance.transfers = test_value
         self.assertEqual(self.instance.transfers, test_value)
 
@@ -82,7 +83,7 @@ def test_agencyId_property(self):
         """
         Test agencyId property
         """
-        test_value = 'nfyiccnhvrzwqpvvszgz'
+        test_value = 'mewwocpcktemydtubprd'
         self.instance.agencyId = test_value
         self.assertEqual(self.instance.agencyId, test_value)
 
@@ -90,7 +91,7 @@ def test_transferDuration_property(self):
         """
         Test transferDuration property
         """
-        test_value = int(26)
+        test_value = int(46)
         self.instance.transferDuration = test_value
         self.assertEqual(self.instance.transferDuration, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_farelegrules.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_farelegrules.py
index b74c368..1480db7 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_farelegrules.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_farelegrules.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.farelegrules import FareLegRules
 
+
 class Test_FareLegRules(unittest.TestCase):
     """
     Test case for FareLegRules
@@ -27,12 +28,12 @@ def create_instance():
         Create instance of FareLegRules for testing
         """
         instance = FareLegRules(
-            fareLegRuleId='ovfbibcihfzqcckxeemc',
-            fareProductId='smzefxmjdwlbgjtuvzga',
-            legGroupId='andtgdcrphfkqxwnxsfx',
-            networkId='zsomgvdfvjfxvxbnqvuc',
-            fromAreaId='euaxgnnmrgomzgsddrwv',
-            toAreaId='sevthhgbalqfxqxyzrjo'
+            fareLegRuleId='tyuboylsydzpahjvsqnc',
+            fareProductId='fyiroseqagkmghwsktma',
+            legGroupId='nknrbfvcrngkqsbooeyo',
+            networkId='zwgvvuccchpswozxbcam',
+            fromAreaId='yoeqbljuliczxsybcehx',
+            toAreaId='eondmrfxuntgelbbgrsw'
         )
         return instance
 
@@ -41,7 +42,7 @@ def test_fareLegRuleId_property(self):
         """
         Test fareLegRuleId property
         """
-        test_value = 'ovfbibcihfzqcckxeemc'
+        test_value = 'tyuboylsydzpahjvsqnc'
         self.instance.fareLegRuleId = test_value
         self.assertEqual(self.instance.fareLegRuleId, test_value)
 
@@ -49,7 +50,7 @@ def test_fareProductId_property(self):
         """
         Test fareProductId property
         """
-        test_value = 'smzefxmjdwlbgjtuvzga'
+        test_value = 'fyiroseqagkmghwsktma'
         self.instance.fareProductId = test_value
         self.assertEqual(self.instance.fareProductId, test_value)
 
@@ -57,7 +58,7 @@ def test_legGroupId_property(self):
         """
         Test legGroupId property
         """
-        test_value = 'andtgdcrphfkqxwnxsfx'
+        test_value = 'nknrbfvcrngkqsbooeyo'
         self.instance.legGroupId = test_value
         self.assertEqual(self.instance.legGroupId, test_value)
 
@@ -65,7 +66,7 @@ def test_networkId_property(self):
         """
         Test networkId property
         """
-        test_value = 'zsomgvdfvjfxvxbnqvuc'
+        test_value = 'zwgvvuccchpswozxbcam'
         self.instance.networkId = test_value
         self.assertEqual(self.instance.networkId, test_value)
 
@@ -73,7 +74,7 @@ def test_fromAreaId_property(self):
         """
         Test fromAreaId property
         """
-        test_value = 'euaxgnnmrgomzgsddrwv'
+        test_value = 'yoeqbljuliczxsybcehx'
         self.instance.fromAreaId = test_value
         self.assertEqual(self.instance.fromAreaId, test_value)
 
@@ -81,7 +82,7 @@ def test_toAreaId_property(self):
         """
         Test toAreaId property
         """
-        test_value = 'sevthhgbalqfxqxyzrjo'
+        test_value = 'eondmrfxuntgelbbgrsw'
         self.instance.toAreaId = test_value
         self.assertEqual(self.instance.toAreaId, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_faremedia.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_faremedia.py
index 1262785..df6f2d6 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_faremedia.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_faremedia.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.faremedia import FareMedia
 
+
 class Test_FareMedia(unittest.TestCase):
     """
     Test case for FareMedia
@@ -27,10 +28,10 @@ def create_instance():
         Create instance of FareMedia for testing
         """
         instance = FareMedia(
-            fareMediaId='egxxeuuvicnngiitlzbr',
-            fareMediaName='mbgcwbqqiwbboxkxtmxd',
-            fareMediaDesc='khykphimfpnzvnqnhgwf',
-            fareMediaUrl='bvmjmcqyjouethyhfbgt'
+            fareMediaId='dekytnmsbpqmxuimyych',
+            fareMediaName='lijvioehsnjilyrytpmn',
+            fareMediaDesc='pbuztiqdeoorpkkagwhv',
+            fareMediaUrl='gztgagyhfpbnagoramyn'
         )
         return instance
 
@@ -39,7 +40,7 @@ def test_fareMediaId_property(self):
         """
         Test fareMediaId property
         """
-        test_value = 'egxxeuuvicnngiitlzbr'
+        test_value = 'dekytnmsbpqmxuimyych'
         self.instance.fareMediaId = test_value
         self.assertEqual(self.instance.fareMediaId, test_value)
 
@@ -47,7 +48,7 @@ def test_fareMediaName_property(self):
         """
         Test fareMediaName property
         """
-        test_value = 'mbgcwbqqiwbboxkxtmxd'
+        test_value = 'lijvioehsnjilyrytpmn'
         self.instance.fareMediaName = test_value
         self.assertEqual(self.instance.fareMediaName, test_value)
 
@@ -55,7 +56,7 @@ def test_fareMediaDesc_property(self):
         """
         Test fareMediaDesc property
         """
-        test_value = 'khykphimfpnzvnqnhgwf'
+        test_value = 'pbuztiqdeoorpkkagwhv'
         self.instance.fareMediaDesc = test_value
         self.assertEqual(self.instance.fareMediaDesc, test_value)
 
@@ -63,7 +64,7 @@ def test_fareMediaUrl_property(self):
         """
         Test fareMediaUrl property
         """
-        test_value = 'bvmjmcqyjouethyhfbgt'
+        test_value = 'gztgagyhfpbnagoramyn'
         self.instance.fareMediaUrl = test_value
         self.assertEqual(self.instance.fareMediaUrl, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_fareproducts.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_fareproducts.py
index d5bd2d7..3860544 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_fareproducts.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_fareproducts.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.fareproducts import FareProducts
 
+
 class Test_FareProducts(unittest.TestCase):
     """
     Test case for FareProducts
@@ -27,10 +28,10 @@ def create_instance():
         Create instance of FareProducts for testing
         """
         instance = FareProducts(
-            fareProductId='oprpyrrjregyzzjaeoec',
-            fareProductName='julwywixcyiymxxanbut',
-            fareProductDesc='kedhverjwpqxtcmtggpa',
-            fareProductUrl='xtunmzodhyeghaxfrlma'
+            fareProductId='nqaxnelvwngnwrdppzdf',
+            fareProductName='wczckotrsjsfkkfwprrj',
+            fareProductDesc='xoafhfsimofdturgmwst',
+            fareProductUrl='nvjpmwtaipfniifeiros'
         )
         return instance
 
@@ -39,7 +40,7 @@ def test_fareProductId_property(self):
         """
         Test fareProductId property
         """
-        test_value = 'oprpyrrjregyzzjaeoec'
+        test_value = 'nqaxnelvwngnwrdppzdf'
         self.instance.fareProductId = test_value
         self.assertEqual(self.instance.fareProductId, test_value)
 
@@ -47,7 +48,7 @@ def test_fareProductName_property(self):
         """
         Test fareProductName property
         """
-        test_value = 'julwywixcyiymxxanbut'
+        test_value = 'wczckotrsjsfkkfwprrj'
         self.instance.fareProductName = test_value
         self.assertEqual(self.instance.fareProductName, test_value)
 
@@ -55,7 +56,7 @@ def test_fareProductDesc_property(self):
         """
         Test fareProductDesc property
         """
-        test_value = 'kedhverjwpqxtcmtggpa'
+        test_value = 'xoafhfsimofdturgmwst'
         self.instance.fareProductDesc = test_value
         self.assertEqual(self.instance.fareProductDesc, test_value)
 
@@ -63,7 +64,7 @@ def test_fareProductUrl_property(self):
         """
         Test fareProductUrl property
         """
-        test_value = 'xtunmzodhyeghaxfrlma'
+        test_value = 'nvjpmwtaipfniifeiros'
         self.instance.fareProductUrl = test_value
         self.assertEqual(self.instance.fareProductUrl, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_farerules.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_farerules.py
index 265b8c7..b176b90 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_farerules.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_farerules.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.farerules import FareRules
 
+
 class Test_FareRules(unittest.TestCase):
     """
     Test case for FareRules
@@ -27,11 +28,11 @@ def create_instance():
         Create instance of FareRules for testing
         """
         instance = FareRules(
-            fareId='ttkrfcngztnpkxukaemh',
-            routeId='bpowcvruzanlviqweohl',
-            originId='qodupbavlffdtmkoytjr',
-            destinationId='semjqkpjulfpqzqmxxwh',
-            containsId='svlgjcyssrxkejenxdaz'
+            fareId='ttnqkqtjxbwsdobtskrl',
+            routeId='wulhsirwheytcyfufogc',
+            originId='leycyycmxousiycleakd',
+            destinationId='fqfhyeaimpugrgedfvbf',
+            containsId='ariogpzbaycafgiqojcw'
         )
         return instance
 
@@ -40,7 +41,7 @@ def test_fareId_property(self):
""" Test fareId property """ - test_value = 'ttkrfcngztnpkxukaemh' + test_value = 'ttnqkqtjxbwsdobtskrl' self.instance.fareId = test_value self.assertEqual(self.instance.fareId, test_value) @@ -48,7 +49,7 @@ def test_routeId_property(self): """ Test routeId property """ - test_value = 'bpowcvruzanlviqweohl' + test_value = 'wulhsirwheytcyfufogc' self.instance.routeId = test_value self.assertEqual(self.instance.routeId, test_value) @@ -56,7 +57,7 @@ def test_originId_property(self): """ Test originId property """ - test_value = 'qodupbavlffdtmkoytjr' + test_value = 'leycyycmxousiycleakd' self.instance.originId = test_value self.assertEqual(self.instance.originId, test_value) @@ -64,7 +65,7 @@ def test_destinationId_property(self): """ Test destinationId property """ - test_value = 'semjqkpjulfpqzqmxxwh' + test_value = 'fqfhyeaimpugrgedfvbf' self.instance.destinationId = test_value self.assertEqual(self.instance.destinationId, test_value) @@ -72,7 +73,7 @@ def test_containsId_property(self): """ Test containsId property """ - test_value = 'svlgjcyssrxkejenxdaz' + test_value = 'ariogpzbaycafgiqojcw' self.instance.containsId = test_value self.assertEqual(self.instance.containsId, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_faretransferrules.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_faretransferrules.py index ecab834..b7d6674 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_faretransferrules.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_faretransferrules.py @@ -10,6 +10,7 @@ from gtfs_rt_producer_data.generaltransitfeedstatic.faretransferrules import FareTransferRules + class Test_FareTransferRules(unittest.TestCase): """ Test case for FareTransferRules @@ -27,13 +28,13 @@ def create_instance(): Create instance of FareTransferRules for testing """ instance = FareTransferRules( - fareTransferRuleId='govgpqebhiylvomaunjq', - fareProductId='sxtgrvwhqcwytqkuvpmm', - transferCount=int(62), - fromLegGroupId='eogrgfgfjcwxirtdlmyz', - toLegGroupId='enolxksgcoeiioyiqwvo', - duration=int(50), - durationType='jjqndfqaqxyxqchkvunb' + fareTransferRuleId='exmmzijkpsafmgxplfov', + fareProductId='vkhweszmbzvntgdmhfqe', + transferCount=int(77), + fromLegGroupId='pbqnpembpbrsluswaenn', + toLegGroupId='pmykpgzlpttstciiyebc', + duration=int(6), + durationType='cmkkhwzkrcdkfiguehht' ) return instance @@ -42,7 +43,7 @@ def test_fareTransferRuleId_property(self): """ Test fareTransferRuleId property """ - test_value = 'govgpqebhiylvomaunjq' + test_value = 'exmmzijkpsafmgxplfov' self.instance.fareTransferRuleId = test_value self.assertEqual(self.instance.fareTransferRuleId, test_value) @@ -50,7 +51,7 @@ def test_fareProductId_property(self): """ Test fareProductId property """ - test_value = 'sxtgrvwhqcwytqkuvpmm' + test_value = 'vkhweszmbzvntgdmhfqe' self.instance.fareProductId = test_value self.assertEqual(self.instance.fareProductId, test_value) @@ -58,7 +59,7 @@ def test_transferCount_property(self): """ Test transferCount property """ - test_value = int(62) + test_value = int(77) self.instance.transferCount = test_value self.assertEqual(self.instance.transferCount, test_value) @@ -66,7 +67,7 @@ def test_fromLegGroupId_property(self): """ Test fromLegGroupId property """ - test_value = 'eogrgfgfjcwxirtdlmyz' + test_value = 
'pbqnpembpbrsluswaenn' self.instance.fromLegGroupId = test_value self.assertEqual(self.instance.fromLegGroupId, test_value) @@ -74,7 +75,7 @@ def test_toLegGroupId_property(self): """ Test toLegGroupId property """ - test_value = 'enolxksgcoeiioyiqwvo' + test_value = 'pmykpgzlpttstciiyebc' self.instance.toLegGroupId = test_value self.assertEqual(self.instance.toLegGroupId, test_value) @@ -82,7 +83,7 @@ def test_duration_property(self): """ Test duration property """ - test_value = int(50) + test_value = int(6) self.instance.duration = test_value self.assertEqual(self.instance.duration, test_value) @@ -90,7 +91,7 @@ def test_durationType_property(self): """ Test durationType property """ - test_value = 'jjqndfqaqxyxqchkvunb' + test_value = 'cmkkhwzkrcdkfiguehht' self.instance.durationType = test_value self.assertEqual(self.instance.durationType, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_feedinfo.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_feedinfo.py index a42a1d4..55aba7a 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_feedinfo.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_feedinfo.py @@ -10,6 +10,7 @@ from gtfs_rt_producer_data.generaltransitfeedstatic.feedinfo import FeedInfo + class Test_FeedInfo(unittest.TestCase): """ Test case for FeedInfo @@ -27,15 +28,15 @@ def create_instance(): Create instance of FeedInfo for testing """ instance = FeedInfo( - feedPublisherName='rgmpzecaqjtfoxcxnxxg', - feedPublisherUrl='mzzwhabkkvxbocwkickx', - feedLang='vsepzxlmktvdbetgekvh', - defaultLang='etwmytxjvkpfgxqpotpq', - feedStartDate='luwljtamyyptkqydgdsg', - feedEndDate='enphujpyyqszgtjyiipl', - feedVersion='dlyrjbursyhapiycwzps', - feedContactEmail='hmnrqbclbvzodpblnzbe', - feedContactUrl='ztjyhmnupgcrnkprtkkg' + feedPublisherName='idfpqzinuppjmyfkivks', + feedPublisherUrl='wvhrqycyhfwasikjxvrk', + feedLang='zuzglwchokaashigmsbl', + defaultLang='jztfwdwuqnhnmgaxyahs', + feedStartDate='cyhkdfbvnpqrefnxzkym', + feedEndDate='fqfamvdjrnxjkpxzegcb', + feedVersion='nptxmhhqqxnipizfuuhz', + feedContactEmail='yferfhyrftuubefaebbq', + feedContactUrl='aolzmqzcuptaartrdkrh' ) return instance @@ -44,7 +45,7 @@ def test_feedPublisherName_property(self): """ Test feedPublisherName property """ - test_value = 'rgmpzecaqjtfoxcxnxxg' + test_value = 'idfpqzinuppjmyfkivks' self.instance.feedPublisherName = test_value self.assertEqual(self.instance.feedPublisherName, test_value) @@ -52,7 +53,7 @@ def test_feedPublisherUrl_property(self): """ Test feedPublisherUrl property """ - test_value = 'mzzwhabkkvxbocwkickx' + test_value = 'wvhrqycyhfwasikjxvrk' self.instance.feedPublisherUrl = test_value self.assertEqual(self.instance.feedPublisherUrl, test_value) @@ -60,7 +61,7 @@ def test_feedLang_property(self): """ Test feedLang property """ - test_value = 'vsepzxlmktvdbetgekvh' + test_value = 'zuzglwchokaashigmsbl' self.instance.feedLang = test_value self.assertEqual(self.instance.feedLang, test_value) @@ -68,7 +69,7 @@ def test_defaultLang_property(self): """ Test defaultLang property """ - test_value = 'etwmytxjvkpfgxqpotpq' + test_value = 'jztfwdwuqnhnmgaxyahs' self.instance.defaultLang = test_value self.assertEqual(self.instance.defaultLang, test_value) @@ -76,7 +77,7 @@ def test_feedStartDate_property(self): """ Test feedStartDate property """ 
-        test_value = 'luwljtamyyptkqydgdsg'
+        test_value = 'cyhkdfbvnpqrefnxzkym'
         self.instance.feedStartDate = test_value
         self.assertEqual(self.instance.feedStartDate, test_value)
 
@@ -84,7 +85,7 @@ def test_feedEndDate_property(self):
         """
         Test feedEndDate property
         """
-        test_value = 'enphujpyyqszgtjyiipl'
+        test_value = 'fqfamvdjrnxjkpxzegcb'
         self.instance.feedEndDate = test_value
         self.assertEqual(self.instance.feedEndDate, test_value)
 
@@ -92,7 +93,7 @@ def test_feedVersion_property(self):
         """
         Test feedVersion property
         """
-        test_value = 'dlyrjbursyhapiycwzps'
+        test_value = 'nptxmhhqqxnipizfuuhz'
         self.instance.feedVersion = test_value
         self.assertEqual(self.instance.feedVersion, test_value)
 
@@ -100,7 +101,7 @@ def test_feedContactEmail_property(self):
         """
         Test feedContactEmail property
         """
-        test_value = 'hmnrqbclbvzodpblnzbe'
+        test_value = 'yferfhyrftuubefaebbq'
         self.instance.feedContactEmail = test_value
         self.assertEqual(self.instance.feedContactEmail, test_value)
 
@@ -108,7 +109,7 @@ def test_feedContactUrl_property(self):
         """
         Test feedContactUrl property
         """
-        test_value = 'ztjyhmnupgcrnkprtkkg'
+        test_value = 'aolzmqzcuptaartrdkrh'
         self.instance.feedContactUrl = test_value
         self.assertEqual(self.instance.feedContactUrl, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_frequencies.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_frequencies.py
index 8702b5a..1bd73a7 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_frequencies.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_frequencies.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.frequencies import Frequencies
 
+
 class Test_Frequencies(unittest.TestCase):
     """
     Test case for Frequencies
@@ -27,11 +28,11 @@ def create_instance():
         Create instance of Frequencies for testing
         """
         instance = Frequencies(
-            tripId='gjwwztawlakcvtocfeuj',
-            startTime='qhtgrdwnmgalkeukpnmb',
-            endTime='kdznwysjddwjmqalgach',
-            headwaySecs=int(86),
-            exactTimes=int(49)
+            tripId='fjxtdzfttriezvoqfmew',
+            startTime='mmzcybolqguwsxjhuley',
+            endTime='pfumxgixdnkereykdcye',
+            headwaySecs=int(1),
+            exactTimes=int(35)
         )
         return instance
 
@@ -40,7 +41,7 @@ def test_tripId_property(self):
         """
         Test tripId property
         """
-        test_value = 'gjwwztawlakcvtocfeuj'
+        test_value = 'fjxtdzfttriezvoqfmew'
         self.instance.tripId = test_value
         self.assertEqual(self.instance.tripId, test_value)
 
@@ -48,7 +49,7 @@ def test_startTime_property(self):
         """
         Test startTime property
         """
-        test_value = 'qhtgrdwnmgalkeukpnmb'
+        test_value = 'mmzcybolqguwsxjhuley'
         self.instance.startTime = test_value
         self.assertEqual(self.instance.startTime, test_value)
 
@@ -56,7 +57,7 @@ def test_endTime_property(self):
         """
         Test endTime property
         """
-        test_value = 'kdznwysjddwjmqalgach'
+        test_value = 'pfumxgixdnkereykdcye'
         self.instance.endTime = test_value
         self.assertEqual(self.instance.endTime, test_value)
 
@@ -64,7 +65,7 @@ def test_headwaySecs_property(self):
         """
         Test headwaySecs property
         """
-        test_value = int(86)
+        test_value = int(1)
         self.instance.headwaySecs = test_value
         self.assertEqual(self.instance.headwaySecs, test_value)
 
@@ -72,7 +73,7 @@ def test_exactTimes_property(self):
         """
         Test exactTimes property
         """
-        test_value = int(49)
+        test_value = int(35)
         self.instance.exactTimes = test_value
         self.assertEqual(self.instance.exactTimes, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_levels.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_levels.py
index f86b38e..02e2eea 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_levels.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_levels.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.levels import Levels
 
+
 class Test_Levels(unittest.TestCase):
     """
     Test case for Levels
@@ -27,9 +28,9 @@ def create_instance():
         Create instance of Levels for testing
         """
         instance = Levels(
-            levelId='khuikpzjukkqkhugtemu',
-            levelIndex=float(41.047808343759684),
-            levelName='aqxmkrdwousvjlcwkqvd'
+            levelId='adepwwutlzffliyzedzq',
+            levelIndex=float(32.29515479984838),
+            levelName='uvajkqgxstpohcdyihli'
         )
         return instance
 
@@ -38,7 +39,7 @@ def test_levelId_property(self):
         """
         Test levelId property
         """
-        test_value = 'khuikpzjukkqkhugtemu'
+        test_value = 'adepwwutlzffliyzedzq'
         self.instance.levelId = test_value
         self.assertEqual(self.instance.levelId, test_value)
 
@@ -46,7 +47,7 @@ def test_levelIndex_property(self):
         """
         Test levelIndex property
         """
-        test_value = float(41.047808343759684)
+        test_value = float(32.29515479984838)
         self.instance.levelIndex = test_value
         self.assertEqual(self.instance.levelIndex, test_value)
 
@@ -54,7 +55,7 @@ def test_levelName_property(self):
         """
         Test levelName property
         """
-        test_value = 'aqxmkrdwousvjlcwkqvd'
+        test_value = 'uvajkqgxstpohcdyihli'
         self.instance.levelName = test_value
         self.assertEqual(self.instance.levelName, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgeojson.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgeojson.py
index c11913c..0180be9 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgeojson.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgeojson.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.locationgeojson import LocationGeoJson
 
+
 class Test_LocationGeoJson(unittest.TestCase):
     """
     Test case for LocationGeoJson
@@ -27,9 +28,9 @@ def create_instance():
         Create instance of LocationGeoJson for testing
         """
         instance = LocationGeoJson(
-            locationGeoJsonId='pgmgsuvknfusdqwtssys',
-            locationGeoJsonType='wqajyiaiytfcioyxsbgp',
-            locationGeoJsonData='uopwefqzjuwqeijyppvj'
+            locationGeoJsonId='icwpipfnjxctujnacjnl',
+            locationGeoJsonType='aihywqwwmiujbepezyeo',
+            locationGeoJsonData='hlkiwmzmeshjdcklbtua'
         )
         return instance
 
@@ -38,7 +39,7 @@ def test_locationGeoJsonId_property(self):
         """
         Test locationGeoJsonId property
         """
-        test_value = 'pgmgsuvknfusdqwtssys'
+        test_value = 'icwpipfnjxctujnacjnl'
         self.instance.locationGeoJsonId = test_value
         self.assertEqual(self.instance.locationGeoJsonId, test_value)
 
@@ -46,7 +47,7 @@ def test_locationGeoJsonType_property(self):
         """
         Test locationGeoJsonType property
         """
-        test_value = 'wqajyiaiytfcioyxsbgp'
+        test_value = 'aihywqwwmiujbepezyeo'
         self.instance.locationGeoJsonType = test_value
         self.assertEqual(self.instance.locationGeoJsonType, test_value)
 
@@ -54,7 +55,7 @@ def test_locationGeoJsonData_property(self):
         """
         Test locationGeoJsonData property
         """
-        test_value = 'uopwefqzjuwqeijyppvj'
+        test_value = 'hlkiwmzmeshjdcklbtua'
         self.instance.locationGeoJsonData = test_value
         self.assertEqual(self.instance.locationGeoJsonData, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgroups.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgroups.py
index ec0f6ae..2120389 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgroups.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgroups.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.locationgroups import LocationGroups
 
+
 class Test_LocationGroups(unittest.TestCase):
     """
     Test case for LocationGroups
@@ -27,10 +28,10 @@ def create_instance():
         Create instance of LocationGroups for testing
         """
         instance = LocationGroups(
-            locationGroupId='qyxeifbgyeukduqrumdg',
-            locationGroupName='btnwpvwlrbszisfiuals',
-            locationGroupDesc='ggwvrbpdguicauubqwcy',
-            locationGroupUrl='tbupmeyrjhnszlvgxjgu'
+            locationGroupId='uloebyrrwpxmcjldyhwd',
+            locationGroupName='dvrnopuzoaczdpncjatn',
+            locationGroupDesc='cqpqnvsnylvpmghokjfj',
+            locationGroupUrl='gczugwjrvtentfmhsond'
         )
         return instance
 
@@ -39,7 +40,7 @@ def test_locationGroupId_property(self):
         """
         Test locationGroupId property
         """
-        test_value = 'qyxeifbgyeukduqrumdg'
+        test_value = 'uloebyrrwpxmcjldyhwd'
         self.instance.locationGroupId = test_value
         self.assertEqual(self.instance.locationGroupId, test_value)
 
@@ -47,7 +48,7 @@ def test_locationGroupName_property(self):
         """
         Test locationGroupName property
         """
-        test_value = 'btnwpvwlrbszisfiuals'
+        test_value = 'dvrnopuzoaczdpncjatn'
         self.instance.locationGroupName = test_value
         self.assertEqual(self.instance.locationGroupName, test_value)
 
@@ -55,7 +56,7 @@ def test_locationGroupDesc_property(self):
         """
         Test locationGroupDesc property
         """
-        test_value = 'ggwvrbpdguicauubqwcy'
+        test_value = 'cqpqnvsnylvpmghokjfj'
         self.instance.locationGroupDesc = test_value
         self.assertEqual(self.instance.locationGroupDesc, test_value)
 
@@ -63,7 +64,7 @@ def test_locationGroupUrl_property(self):
         """
         Test locationGroupUrl property
         """
-        test_value = 'tbupmeyrjhnszlvgxjgu'
+        test_value = 'gczugwjrvtentfmhsond'
         self.instance.locationGroupUrl = test_value
         self.assertEqual(self.instance.locationGroupUrl, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgroupstores.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgroupstores.py
index 55653b9..6f9388c 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgroupstores.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_locationgroupstores.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.locationgroupstores import LocationGroupStores
 
+
 class Test_LocationGroupStores(unittest.TestCase):
     """
     Test case for LocationGroupStores
@@ -27,9 +28,9 @@ def create_instance():
         Create instance of LocationGroupStores for testing
         """
         instance = LocationGroupStores(
-            locationGroupStoreId='xdsoytqjpyrvlysfncgu',
-            locationGroupId='ysswxsogpbiuisitrygp',
-            storeId='tlvtelpsorhrskowlwkp'
+            locationGroupStoreId='ahntjblcukfxrjrwtudc',
+            locationGroupId='mdkkfhjapxnxfmnxclsb',
+            storeId='lovkanhdbhokunqkcbzu'
         )
         return instance
 
@@ -38,7 +39,7 @@ def test_locationGroupStoreId_property(self):
         """
         Test locationGroupStoreId property
         """
-        test_value = 'xdsoytqjpyrvlysfncgu'
+        test_value = 'ahntjblcukfxrjrwtudc'
         self.instance.locationGroupStoreId = test_value
         self.assertEqual(self.instance.locationGroupStoreId, test_value)
 
@@ -46,7 +47,7 @@ def test_locationGroupId_property(self):
         """
         Test locationGroupId property
         """
-        test_value = 'ysswxsogpbiuisitrygp'
+        test_value = 'mdkkfhjapxnxfmnxclsb'
         self.instance.locationGroupId = test_value
         self.assertEqual(self.instance.locationGroupId, test_value)
 
@@ -54,7 +55,7 @@ def test_storeId_property(self):
         """
         Test storeId property
         """
-        test_value = 'tlvtelpsorhrskowlwkp'
+        test_value = 'lovkanhdbhokunqkcbzu'
         self.instance.storeId = test_value
         self.assertEqual(self.instance.storeId, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_networks.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_networks.py
index a77923b..1678352 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_networks.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_networks.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.networks import Networks
 
+
 class Test_Networks(unittest.TestCase):
     """
     Test case for Networks
@@ -27,10 +28,10 @@ def create_instance():
         Create instance of Networks for testing
         """
         instance = Networks(
-            networkId='xnurkeurwhpeddoyugnk',
-            networkName='yexhdoojtyoiauwncgyq',
-            networkDesc='hglhgjwifhscbspgdtsw',
-            networkUrl='ukgbkvcxujynuxhudwnb'
+            networkId='veikseoaoxpovbdxsimj',
+            networkName='mmwznvmrswmnevmaijrd',
+            networkDesc='aawsvkpkvuyytwximukt',
+            networkUrl='lrsafplqzskbjxtcyisi'
        )
         return instance
 
@@ -39,7 +40,7 @@ def test_networkId_property(self):
         """
         Test networkId property
         """
-        test_value = 'xnurkeurwhpeddoyugnk'
+        test_value = 'veikseoaoxpovbdxsimj'
         self.instance.networkId = test_value
         self.assertEqual(self.instance.networkId, test_value)
 
@@ -47,7 +48,7 @@ def test_networkName_property(self):
         """
         Test networkName property
         """
-        test_value = 'yexhdoojtyoiauwncgyq'
+        test_value = 'mmwznvmrswmnevmaijrd'
         self.instance.networkName = test_value
         self.assertEqual(self.instance.networkName, test_value)
 
@@ -55,7 +56,7 @@ def test_networkDesc_property(self):
         """
         Test networkDesc property
         """
-        test_value = 'hglhgjwifhscbspgdtsw'
+        test_value = 'aawsvkpkvuyytwximukt'
         self.instance.networkDesc = test_value
         self.assertEqual(self.instance.networkDesc, test_value)
 
@@ -63,7 +64,7 @@ def test_networkUrl_property(self):
         """
         Test networkUrl property
         """
-        test_value = 'ukgbkvcxujynuxhudwnb'
+        test_value = 'lrsafplqzskbjxtcyisi'
         self.instance.networkUrl = test_value
         self.assertEqual(self.instance.networkUrl, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_pathways.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_pathways.py
index 4980671..954705e 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_pathways.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_pathways.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.pathways import Pathways
 
+
 class Test_Pathways(unittest.TestCase):
     """
     Test case for Pathways
@@ -27,18 +28,18 @@ def create_instance():
         Create instance of Pathways for testing
         """
         instance = Pathways(
-            pathwayId='ncczfqfxmvpiyqbvduut',
-            fromStopId='tlwjglaugocmayhomoak',
-            toStopId='nrpcrfgqmrqexmhvjrkk',
-            pathwayMode=int(58),
-            isBidirectional=int(25),
-            length=float(97.24756695801547),
-            traversalTime=int(23),
-            stairCount=int(84),
-            maxSlope=float(97.3590036957665),
-            minWidth=float(33.23002239778553),
-            signpostedAs='rnnkanrcmllnqetkbfdm',
-            reversedSignpostedAs='idglnepwjwwvezqpnpof'
+            pathwayId='pohchezjmtfxrfcsxnsf',
+            fromStopId='luuavbyabsmhddlbqseg',
+            toStopId='ccmrfylxavaurrjarrnj',
+            pathwayMode=int(17),
+            isBidirectional=int(34),
+            length=float(11.666832648412395),
+            traversalTime=int(35),
+            stairCount=int(65),
+            maxSlope=float(11.160362450082618),
+            minWidth=float(29.954969450484846),
+            signpostedAs='uzkpfdsthchvaobeczwp',
+            reversedSignpostedAs='mubgfnwvfpnkxtkotvkg'
         )
         return instance
 
@@ -47,7 +48,7 @@ def test_pathwayId_property(self):
         """
         Test pathwayId property
         """
-        test_value = 'ncczfqfxmvpiyqbvduut'
+        test_value = 'pohchezjmtfxrfcsxnsf'
         self.instance.pathwayId = test_value
         self.assertEqual(self.instance.pathwayId, test_value)
 
@@ -55,7 +56,7 @@ def test_fromStopId_property(self):
         """
         Test fromStopId property
         """
-        test_value = 'tlwjglaugocmayhomoak'
+        test_value = 'luuavbyabsmhddlbqseg'
         self.instance.fromStopId = test_value
         self.assertEqual(self.instance.fromStopId, test_value)
 
@@ -63,7 +64,7 @@ def test_toStopId_property(self):
         """
         Test toStopId property
         """
-        test_value = 'nrpcrfgqmrqexmhvjrkk'
+        test_value = 'ccmrfylxavaurrjarrnj'
         self.instance.toStopId = test_value
         self.assertEqual(self.instance.toStopId, test_value)
 
@@ -71,7 +72,7 @@ def test_pathwayMode_property(self):
         """
         Test pathwayMode property
         """
-        test_value = int(58)
+        test_value = int(17)
         self.instance.pathwayMode = test_value
         self.assertEqual(self.instance.pathwayMode, test_value)
 
@@ -79,7 +80,7 @@ def test_isBidirectional_property(self):
         """
         Test isBidirectional property
         """
-        test_value = int(25)
+        test_value = int(34)
         self.instance.isBidirectional = test_value
         self.assertEqual(self.instance.isBidirectional, test_value)
 
@@ -87,7 +88,7 @@ def test_length_property(self):
         """
         Test length property
         """
-        test_value = float(97.24756695801547)
+        test_value = float(11.666832648412395)
         self.instance.length = test_value
         self.assertEqual(self.instance.length, test_value)
 
@@ -95,7 +96,7 @@ def test_traversalTime_property(self):
         """
         Test traversalTime property
         """
-        test_value = int(23)
+        test_value = int(35)
         self.instance.traversalTime = test_value
         self.assertEqual(self.instance.traversalTime, test_value)
 
@@ -103,7 +104,7 @@ def test_stairCount_property(self):
         """
         Test stairCount property
         """
-        test_value = int(84)
+        test_value = int(65)
         self.instance.stairCount = test_value
         self.assertEqual(self.instance.stairCount, test_value)
 
@@ -111,7 +112,7 @@ def test_maxSlope_property(self):
         """
         Test maxSlope property
         """
-        test_value = float(97.3590036957665)
+        test_value = float(11.160362450082618)
         self.instance.maxSlope = test_value
         self.assertEqual(self.instance.maxSlope, test_value)
 
@@ -119,7 +120,7 @@ def test_minWidth_property(self):
         """
         Test minWidth property
         """
-        test_value = float(33.23002239778553)
+        test_value = float(29.954969450484846)
         self.instance.minWidth = test_value
         self.assertEqual(self.instance.minWidth, test_value)
 
@@ -127,7 +128,7 @@ def test_signpostedAs_property(self):
         """
         Test signpostedAs property
         """
-        test_value = 'rnnkanrcmllnqetkbfdm'
+        test_value = 'uzkpfdsthchvaobeczwp'
         self.instance.signpostedAs = test_value
         self.assertEqual(self.instance.signpostedAs, test_value)
 
@@ -135,7 +136,7 @@ def test_reversedSignpostedAs_property(self):
         """
         Test reversedSignpostedAs property
         """
-        test_value = 'idglnepwjwwvezqpnpof'
+        test_value = 'mubgfnwvfpnkxtkotvkg'
         self.instance.reversedSignpostedAs = test_value
         self.assertEqual(self.instance.reversedSignpostedAs, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_routenetworks.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_routenetworks.py
index e9e51d9..589d6a4 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_routenetworks.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_routenetworks.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.routenetworks import RouteNetworks
 
+
 class Test_RouteNetworks(unittest.TestCase):
     """
     Test case for RouteNetworks
@@ -27,9 +28,9 @@ def create_instance():
         Create instance of RouteNetworks for testing
         """
         instance = RouteNetworks(
-            routeNetworkId='tfxlwmwjisqmpqrwjhny',
-            routeId='sguuqhxupbwtfvyvnvfu',
-            networkId='erkydgrinqnyoyrqkdvt'
+            routeNetworkId='lbpbmrsqzamoplzztbak',
+            routeId='dkisralwzfmaqhfoupqx',
+            networkId='vvcbnmrxhahzrfzmaeet'
        )
         return instance
 
@@ -38,7 +39,7 @@ def test_routeNetworkId_property(self):
         """
         Test routeNetworkId property
         """
-        test_value = 'tfxlwmwjisqmpqrwjhny'
+        test_value = 'lbpbmrsqzamoplzztbak'
         self.instance.routeNetworkId = test_value
         self.assertEqual(self.instance.routeNetworkId, test_value)
 
@@ -46,7 +47,7 @@ def test_routeId_property(self):
         """
         Test routeId property
         """
-        test_value = 'sguuqhxupbwtfvyvnvfu'
+        test_value = 'dkisralwzfmaqhfoupqx'
         self.instance.routeId = test_value
         self.assertEqual(self.instance.routeId, test_value)
 
@@ -54,7 +55,7 @@ def test_networkId_property(self):
         """
         Test networkId property
         """
-        test_value = 'erkydgrinqnyoyrqkdvt'
+        test_value = 'vvcbnmrxhahzrfzmaeet'
         self.instance.networkId = test_value
         self.assertEqual(self.instance.networkId, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_routes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_routes.py
index 02ce830..4a579d9 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_routes.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_routes.py
@@ -9,10 +9,11 @@
 sys.path.append(os.path.realpath(os.path.join(os.path.dirname(__file__), '../src'.replace('/', os.sep))))
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.routes import Routes
-from test_gtfs_rt_producer_data_generaltransitfeedstatic_routetype import Test_RouteType
 from test_gtfs_rt_producer_data_generaltransitfeedstatic_continuousdropoff import Test_ContinuousDropOff
+from test_gtfs_rt_producer_data_generaltransitfeedstatic_routetype import Test_RouteType
 from
test_gtfs_rt_producer_data_generaltransitfeedstatic_continuouspickup import Test_ContinuousPickup + class Test_Routes(unittest.TestCase): """ Test case for Routes @@ -30,19 +31,19 @@ def create_instance(): Create instance of Routes for testing """ instance = Routes( - routeId='khtwstgrvxtnnyrqxuop', - agencyId='xsyechecyvweqybyaizt', - routeShortName='yxwfnocgwicqnmuvtopd', - routeLongName='ixccibbpjsmwyxqsxykl', - routeDesc='alqqgjjltswhsxhcylfb', + routeId='txcwbtfxpexbrhobrjpp', + agencyId='torcsxysxlefpuqnyglk', + routeShortName='archwdwgkwbcmmtwtqma', + routeLongName='qyytldoykhfaadsaxgsx', + routeDesc='moisytwbuyrewzulfctj', routeType=Test_RouteType.create_instance(), - routeUrl='xzbieevtyprndfirjdpv', - routeColor='ulkcevhbsbncjasfbins', - routeTextColor='rttubzskbiszonpuhqmu', - routeSortOrder=int(32), + routeUrl='bniejdotuwkvutvyvglv', + routeColor='ljuvwbbapgsuybjkblfy', + routeTextColor='gcbtergkmnicbimufjvm', + routeSortOrder=int(30), continuousPickup=Test_ContinuousPickup.create_instance(), continuousDropOff=Test_ContinuousDropOff.create_instance(), - networkId='oxuldrxkfscalkfkwpmy' + networkId='rhfolkfgduuxyrajlgqo' ) return instance @@ -51,7 +52,7 @@ def test_routeId_property(self): """ Test routeId property """ - test_value = 'khtwstgrvxtnnyrqxuop' + test_value = 'txcwbtfxpexbrhobrjpp' self.instance.routeId = test_value self.assertEqual(self.instance.routeId, test_value) @@ -59,7 +60,7 @@ def test_agencyId_property(self): """ Test agencyId property """ - test_value = 'xsyechecyvweqybyaizt' + test_value = 'torcsxysxlefpuqnyglk' self.instance.agencyId = test_value self.assertEqual(self.instance.agencyId, test_value) @@ -67,7 +68,7 @@ def test_routeShortName_property(self): """ Test routeShortName property """ - test_value = 'yxwfnocgwicqnmuvtopd' + test_value = 'archwdwgkwbcmmtwtqma' self.instance.routeShortName = test_value self.assertEqual(self.instance.routeShortName, test_value) @@ -75,7 +76,7 @@ def test_routeLongName_property(self): """ Test routeLongName property """ - test_value = 'ixccibbpjsmwyxqsxykl' + test_value = 'qyytldoykhfaadsaxgsx' self.instance.routeLongName = test_value self.assertEqual(self.instance.routeLongName, test_value) @@ -83,7 +84,7 @@ def test_routeDesc_property(self): """ Test routeDesc property """ - test_value = 'alqqgjjltswhsxhcylfb' + test_value = 'moisytwbuyrewzulfctj' self.instance.routeDesc = test_value self.assertEqual(self.instance.routeDesc, test_value) @@ -99,7 +100,7 @@ def test_routeUrl_property(self): """ Test routeUrl property """ - test_value = 'xzbieevtyprndfirjdpv' + test_value = 'bniejdotuwkvutvyvglv' self.instance.routeUrl = test_value self.assertEqual(self.instance.routeUrl, test_value) @@ -107,7 +108,7 @@ def test_routeColor_property(self): """ Test routeColor property """ - test_value = 'ulkcevhbsbncjasfbins' + test_value = 'ljuvwbbapgsuybjkblfy' self.instance.routeColor = test_value self.assertEqual(self.instance.routeColor, test_value) @@ -115,7 +116,7 @@ def test_routeTextColor_property(self): """ Test routeTextColor property """ - test_value = 'rttubzskbiszonpuhqmu' + test_value = 'gcbtergkmnicbimufjvm' self.instance.routeTextColor = test_value self.assertEqual(self.instance.routeTextColor, test_value) @@ -123,7 +124,7 @@ def test_routeSortOrder_property(self): """ Test routeSortOrder property """ - test_value = int(32) + test_value = int(30) self.instance.routeSortOrder = test_value self.assertEqual(self.instance.routeSortOrder, test_value) @@ -147,7 +148,7 @@ def test_networkId_property(self): """ Test networkId 
property """ - test_value = 'oxuldrxkfscalkfkwpmy' + test_value = 'rhfolkfgduuxyrajlgqo' self.instance.networkId = test_value self.assertEqual(self.instance.networkId, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_shapes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_shapes.py index 2b5cc52..80fa4bf 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_shapes.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_shapes.py @@ -10,6 +10,7 @@ from gtfs_rt_producer_data.generaltransitfeedstatic.shapes import Shapes + class Test_Shapes(unittest.TestCase): """ Test case for Shapes @@ -27,11 +28,11 @@ def create_instance(): Create instance of Shapes for testing """ instance = Shapes( - shapeId='syjipszpgohctkkuqhgt', - shapePtLat=float(25.30099856513074), - shapePtLon=float(63.30384579599435), - shapePtSequence=int(99), - shapeDistTraveled=float(21.771484451242472) + shapeId='axylcavqsovayyvefghx', + shapePtLat=float(37.73770684631462), + shapePtLon=float(84.84434094604721), + shapePtSequence=int(25), + shapeDistTraveled=float(94.33924334713187) ) return instance @@ -40,7 +41,7 @@ def test_shapeId_property(self): """ Test shapeId property """ - test_value = 'syjipszpgohctkkuqhgt' + test_value = 'axylcavqsovayyvefghx' self.instance.shapeId = test_value self.assertEqual(self.instance.shapeId, test_value) @@ -48,7 +49,7 @@ def test_shapePtLat_property(self): """ Test shapePtLat property """ - test_value = float(25.30099856513074) + test_value = float(37.73770684631462) self.instance.shapePtLat = test_value self.assertEqual(self.instance.shapePtLat, test_value) @@ -56,7 +57,7 @@ def test_shapePtLon_property(self): """ Test shapePtLon property """ - test_value = float(63.30384579599435) + test_value = float(84.84434094604721) self.instance.shapePtLon = test_value self.assertEqual(self.instance.shapePtLon, test_value) @@ -64,7 +65,7 @@ def test_shapePtSequence_property(self): """ Test shapePtSequence property """ - test_value = int(99) + test_value = int(25) self.instance.shapePtSequence = test_value self.assertEqual(self.instance.shapePtSequence, test_value) @@ -72,7 +73,7 @@ def test_shapeDistTraveled_property(self): """ Test shapeDistTraveled property """ - test_value = float(21.771484451242472) + test_value = float(94.33924334713187) self.instance.shapeDistTraveled = test_value self.assertEqual(self.instance.shapeDistTraveled, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stopareas.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stopareas.py index e7e3413..3048f0d 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stopareas.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stopareas.py @@ -10,6 +10,7 @@ from gtfs_rt_producer_data.generaltransitfeedstatic.stopareas import StopAreas + class Test_StopAreas(unittest.TestCase): """ Test case for StopAreas @@ -27,9 +28,9 @@ def create_instance(): Create instance of StopAreas for testing """ instance = StopAreas( - stopAreaId='xruuibulaibglmyypdls', - stopId='chikveynmseelbnknvol', - areaId='wyuzjkoghnqgkvlscnnc' + 
stopAreaId='hvvicnkgoicsxotvjbpj', + stopId='lthsuooeclhrcjijjshc', + areaId='glzsrbvmdvpyglleeskh' ) return instance @@ -38,7 +39,7 @@ def test_stopAreaId_property(self): """ Test stopAreaId property """ - test_value = 'xruuibulaibglmyypdls' + test_value = 'hvvicnkgoicsxotvjbpj' self.instance.stopAreaId = test_value self.assertEqual(self.instance.stopAreaId, test_value) @@ -46,7 +47,7 @@ def test_stopId_property(self): """ Test stopId property """ - test_value = 'chikveynmseelbnknvol' + test_value = 'lthsuooeclhrcjijjshc' self.instance.stopId = test_value self.assertEqual(self.instance.stopId, test_value) @@ -54,7 +55,7 @@ def test_areaId_property(self): """ Test areaId property """ - test_value = 'wyuzjkoghnqgkvlscnnc' + test_value = 'glzsrbvmdvpyglleeskh' self.instance.areaId = test_value self.assertEqual(self.instance.areaId, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stops.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stops.py index d75f3f9..df7415d 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stops.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stops.py @@ -9,8 +9,9 @@ sys.path.append(os.path.realpath(os.path.join(os.path.dirname(__file__), '../src'.replace('/', os.sep)))) from gtfs_rt_producer_data.generaltransitfeedstatic.stops import Stops -from test_gtfs_rt_producer_data_generaltransitfeedstatic_locationtype import Test_LocationType from test_gtfs_rt_producer_data_generaltransitfeedstatic_wheelchairboarding import Test_WheelchairBoarding +from test_gtfs_rt_producer_data_generaltransitfeedstatic_locationtype import Test_LocationType + class Test_Stops(unittest.TestCase): """ @@ -29,21 +30,21 @@ def create_instance(): Create instance of Stops for testing """ instance = Stops( - stopId='ygelzavdoynttizcgskl', - stopCode='geveaivznoyqusqqmfsw', - stopName='bjzawhdagdzcqbqgsrhl', - ttsStopName='fiqoctnbgrbhqbxxmkmy', - stopDesc='ljgsaerawnntqevejzwi', - stopLat=float(13.65830986702946), - stopLon=float(86.15415069825245), - zoneId='ccaynhxasrjntmgregdx', - stopUrl='njqftyuhgosizrtpsuel', + stopId='kyhxqibvvupddmhltsjn', + stopCode='mwqfqdzkiesentxcssmu', + stopName='czbxilhppzrkznvtuqne', + ttsStopName='qcszkwtculmzvncszyut', + stopDesc='rvyxokbnyoqgwewrumyn', + stopLat=float(53.57772366090949), + stopLon=float(24.860355743303465), + zoneId='okeiqpjkytzelnesbcoi', + stopUrl='dloyefsjlvyytsvjqdys', locationType=Test_LocationType.create_instance(), - parentStation='msvuatzaffuldieiruuy', - stopTimezone='kinxxdzomshwpxtyodnv', + parentStation='cohsncpjxlsmixydwojh', + stopTimezone='pmxooqdxecfcbupbtzrs', wheelchairBoarding=Test_WheelchairBoarding.create_instance(), - levelId='yjparzsdtkvscoezsjhi', - platformCode='tretlpllemrwsnpxiqso' + levelId='qgikbojtujmhhgtkhzvx', + platformCode='hokmzcfwlrfxixfajxtl' ) return instance @@ -52,7 +53,7 @@ def test_stopId_property(self): """ Test stopId property """ - test_value = 'ygelzavdoynttizcgskl' + test_value = 'kyhxqibvvupddmhltsjn' self.instance.stopId = test_value self.assertEqual(self.instance.stopId, test_value) @@ -60,7 +61,7 @@ def test_stopCode_property(self): """ Test stopCode property """ - test_value = 'geveaivznoyqusqqmfsw' + test_value = 'mwqfqdzkiesentxcssmu' self.instance.stopCode = test_value self.assertEqual(self.instance.stopCode, test_value) @@ -68,7 
+69,7 @@ def test_stopName_property(self): """ Test stopName property """ - test_value = 'bjzawhdagdzcqbqgsrhl' + test_value = 'czbxilhppzrkznvtuqne' self.instance.stopName = test_value self.assertEqual(self.instance.stopName, test_value) @@ -76,7 +77,7 @@ def test_ttsStopName_property(self): """ Test ttsStopName property """ - test_value = 'fiqoctnbgrbhqbxxmkmy' + test_value = 'qcszkwtculmzvncszyut' self.instance.ttsStopName = test_value self.assertEqual(self.instance.ttsStopName, test_value) @@ -84,7 +85,7 @@ def test_stopDesc_property(self): """ Test stopDesc property """ - test_value = 'ljgsaerawnntqevejzwi' + test_value = 'rvyxokbnyoqgwewrumyn' self.instance.stopDesc = test_value self.assertEqual(self.instance.stopDesc, test_value) @@ -92,7 +93,7 @@ def test_stopLat_property(self): """ Test stopLat property """ - test_value = float(13.65830986702946) + test_value = float(53.57772366090949) self.instance.stopLat = test_value self.assertEqual(self.instance.stopLat, test_value) @@ -100,7 +101,7 @@ def test_stopLon_property(self): """ Test stopLon property """ - test_value = float(86.15415069825245) + test_value = float(24.860355743303465) self.instance.stopLon = test_value self.assertEqual(self.instance.stopLon, test_value) @@ -108,7 +109,7 @@ def test_zoneId_property(self): """ Test zoneId property """ - test_value = 'ccaynhxasrjntmgregdx' + test_value = 'okeiqpjkytzelnesbcoi' self.instance.zoneId = test_value self.assertEqual(self.instance.zoneId, test_value) @@ -116,7 +117,7 @@ def test_stopUrl_property(self): """ Test stopUrl property """ - test_value = 'njqftyuhgosizrtpsuel' + test_value = 'dloyefsjlvyytsvjqdys' self.instance.stopUrl = test_value self.assertEqual(self.instance.stopUrl, test_value) @@ -132,7 +133,7 @@ def test_parentStation_property(self): """ Test parentStation property """ - test_value = 'msvuatzaffuldieiruuy' + test_value = 'cohsncpjxlsmixydwojh' self.instance.parentStation = test_value self.assertEqual(self.instance.parentStation, test_value) @@ -140,7 +141,7 @@ def test_stopTimezone_property(self): """ Test stopTimezone property """ - test_value = 'kinxxdzomshwpxtyodnv' + test_value = 'pmxooqdxecfcbupbtzrs' self.instance.stopTimezone = test_value self.assertEqual(self.instance.stopTimezone, test_value) @@ -156,7 +157,7 @@ def test_levelId_property(self): """ Test levelId property """ - test_value = 'yjparzsdtkvscoezsjhi' + test_value = 'qgikbojtujmhhgtkhzvx' self.instance.levelId = test_value self.assertEqual(self.instance.levelId, test_value) @@ -164,7 +165,7 @@ def test_platformCode_property(self): """ Test platformCode property """ - test_value = 'tretlpllemrwsnpxiqso' + test_value = 'hokmzcfwlrfxixfajxtl' self.instance.platformCode = test_value self.assertEqual(self.instance.platformCode, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stoptimes.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stoptimes.py index c7144dc..f7f11be 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stoptimes.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_stoptimes.py @@ -9,12 +9,13 @@ sys.path.append(os.path.realpath(os.path.join(os.path.dirname(__file__), '../src'.replace('/', os.sep)))) from gtfs_rt_producer_data.generaltransitfeedstatic.stoptimes import StopTimes -from 
test_gtfs_rt_producer_data_generaltransitfeedstatic_continuousdropoff import Test_ContinuousDropOff from test_gtfs_rt_producer_data_generaltransitfeedstatic_timepoint import Test_Timepoint from test_gtfs_rt_producer_data_generaltransitfeedstatic_pickuptype import Test_PickupType +from test_gtfs_rt_producer_data_generaltransitfeedstatic_continuousdropoff import Test_ContinuousDropOff from test_gtfs_rt_producer_data_generaltransitfeedstatic_dropofftype import Test_DropOffType from test_gtfs_rt_producer_data_generaltransitfeedstatic_continuouspickup import Test_ContinuousPickup + class Test_StopTimes(unittest.TestCase): """ Test case for StopTimes @@ -32,17 +33,17 @@ def create_instance(): Create instance of StopTimes for testing """ instance = StopTimes( - tripId='kxqltkgfybcoqlnjjukg', - arrivalTime='jogfrhzsayymqdlfncba', - departureTime='wynodxhdzrlvbvhljkqx', - stopId='smvmehwizmcekkwvnywl', - stopSequence=int(33), - stopHeadsign='jwsyoixixjnxpdmolcgm', + tripId='ksotvevphvayhopfrtac', + arrivalTime='ebiucpzumxeccfeyrila', + departureTime='cfxsnvlfkmqnvnvzcdkh', + stopId='vpaxomwrwkceaermperi', + stopSequence=int(34), + stopHeadsign='nyqpyjrycfpfzppniptj', pickupType=Test_PickupType.create_instance(), dropOffType=Test_DropOffType.create_instance(), continuousPickup=Test_ContinuousPickup.create_instance(), continuousDropOff=Test_ContinuousDropOff.create_instance(), - shapeDistTraveled=float(4.277457202604884), + shapeDistTraveled=float(23.959564066857787), timepoint=Test_Timepoint.create_instance() ) return instance @@ -52,7 +53,7 @@ def test_tripId_property(self): """ Test tripId property """ - test_value = 'kxqltkgfybcoqlnjjukg' + test_value = 'ksotvevphvayhopfrtac' self.instance.tripId = test_value self.assertEqual(self.instance.tripId, test_value) @@ -60,7 +61,7 @@ def test_arrivalTime_property(self): """ Test arrivalTime property """ - test_value = 'jogfrhzsayymqdlfncba' + test_value = 'ebiucpzumxeccfeyrila' self.instance.arrivalTime = test_value self.assertEqual(self.instance.arrivalTime, test_value) @@ -68,7 +69,7 @@ def test_departureTime_property(self): """ Test departureTime property """ - test_value = 'wynodxhdzrlvbvhljkqx' + test_value = 'cfxsnvlfkmqnvnvzcdkh' self.instance.departureTime = test_value self.assertEqual(self.instance.departureTime, test_value) @@ -76,7 +77,7 @@ def test_stopId_property(self): """ Test stopId property """ - test_value = 'smvmehwizmcekkwvnywl' + test_value = 'vpaxomwrwkceaermperi' self.instance.stopId = test_value self.assertEqual(self.instance.stopId, test_value) @@ -84,7 +85,7 @@ def test_stopSequence_property(self): """ Test stopSequence property """ - test_value = int(33) + test_value = int(34) self.instance.stopSequence = test_value self.assertEqual(self.instance.stopSequence, test_value) @@ -92,7 +93,7 @@ def test_stopHeadsign_property(self): """ Test stopHeadsign property """ - test_value = 'jwsyoixixjnxpdmolcgm' + test_value = 'nyqpyjrycfpfzppniptj' self.instance.stopHeadsign = test_value self.assertEqual(self.instance.stopHeadsign, test_value) @@ -132,7 +133,7 @@ def test_shapeDistTraveled_property(self): """ Test shapeDistTraveled property """ - test_value = float(4.277457202604884) + test_value = float(23.959564066857787) self.instance.shapeDistTraveled = test_value self.assertEqual(self.instance.shapeDistTraveled, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_timeframes.py 
b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_timeframes.py index 2793074..f8b2471 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_timeframes.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_timeframes.py @@ -12,6 +12,7 @@ from test_gtfs_rt_producer_data_generaltransitfeedstatic_calendar import Test_Calendar from test_gtfs_rt_producer_data_generaltransitfeedstatic_calendardates import Test_CalendarDates + class Test_Timeframes(unittest.TestCase): """ Test case for Timeframes @@ -29,9 +30,9 @@ def create_instance(): Create instance of Timeframes for testing """ instance = Timeframes( - timeframeGroupId='vpdgdylsmcmfimmpdseu', - startTime='drxjyjgbqwehmkiffamr', - endTime='mbyonlwqcsqfnokydnnh', + timeframeGroupId='eztdcirkdgljiwpgideb', + startTime='opgpjebgrhrkauqxxohe', + endTime='ifartiukhxrhpcisoxga', serviceDates=Test_Calendar.create_instance() ) return instance @@ -41,7 +42,7 @@ def test_timeframeGroupId_property(self): """ Test timeframeGroupId property """ - test_value = 'vpdgdylsmcmfimmpdseu' + test_value = 'eztdcirkdgljiwpgideb' self.instance.timeframeGroupId = test_value self.assertEqual(self.instance.timeframeGroupId, test_value) @@ -49,7 +50,7 @@ def test_startTime_property(self): """ Test startTime property """ - test_value = 'drxjyjgbqwehmkiffamr' + test_value = 'opgpjebgrhrkauqxxohe' self.instance.startTime = test_value self.assertEqual(self.instance.startTime, test_value) @@ -57,7 +58,7 @@ def test_endTime_property(self): """ Test endTime property """ - test_value = 'mbyonlwqcsqfnokydnnh' + test_value = 'ifartiukhxrhpcisoxga' self.instance.endTime = test_value self.assertEqual(self.instance.endTime, test_value) diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_transfers.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_transfers.py index 3920e8c..fe7350b 100644 --- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_transfers.py +++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_transfers.py @@ -10,6 +10,7 @@ from gtfs_rt_producer_data.generaltransitfeedstatic.transfers import Transfers + class Test_Transfers(unittest.TestCase): """ Test case for Transfers @@ -27,10 +28,10 @@ def create_instance(): Create instance of Transfers for testing """ instance = Transfers( - fromStopId='erxflrjofhxrdkrrmchc', - toStopId='tmmekkjhkpyhpddyfeua', - transferType=int(93), - minTransferTime=int(2) + fromStopId='zpvffixxsukcbbapkwaq', + toStopId='xkipwgtdzfkrvuctncdk', + transferType=int(98), + minTransferTime=int(77) ) return instance @@ -39,7 +40,7 @@ def test_fromStopId_property(self): """ Test fromStopId property """ - test_value = 'erxflrjofhxrdkrrmchc' + test_value = 'zpvffixxsukcbbapkwaq' self.instance.fromStopId = test_value self.assertEqual(self.instance.fromStopId, test_value) @@ -47,7 +48,7 @@ def test_toStopId_property(self): """ Test toStopId property """ - test_value = 'tmmekkjhkpyhpddyfeua' + test_value = 'xkipwgtdzfkrvuctncdk' self.instance.toStopId = test_value self.assertEqual(self.instance.toStopId, test_value) @@ -55,7 +56,7 @@ def test_transferType_property(self): """ Test transferType property """ - test_value = int(93) + test_value = int(98) 
         self.instance.transferType = test_value
         self.assertEqual(self.instance.transferType, test_value)
 
@@ -63,7 +64,7 @@ def test_minTransferTime_property(self):
         """
         Test minTransferTime property
         """
-        test_value = int(2)
+        test_value = int(77)
         self.instance.minTransferTime = test_value
         self.assertEqual(self.instance.minTransferTime, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_translations.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_translations.py
index 37595ee..b5aa000 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_translations.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_translations.py
@@ -10,6 +10,7 @@
 
 from gtfs_rt_producer_data.generaltransitfeedstatic.translations import Translations
 
+
 class Test_Translations(unittest.TestCase):
     """
     Test case for Translations
@@ -27,10 +28,10 @@ def create_instance():
         Create instance of Translations for testing
         """
         instance = Translations(
-            tableName='yknftywydzfsgkhkrjcr',
-            fieldName='mzzclrhafrklvvmeennl',
-            language='nmnhnxujktjszazvlyoe',
-            translation='tmsstuwwhniszwpvpzkq'
+            tableName='aanhizixqqpcfjufuqom',
+            fieldName='qwtvuaxuexmjiacvuhma',
+            language='hrtvteidgzibhlqybpgq',
+            translation='kcjmaskfeydtyoqxdwzk'
         )
         return instance
 
@@ -39,7 +40,7 @@ def test_tableName_property(self):
         """
         Test tableName property
         """
-        test_value = 'yknftywydzfsgkhkrjcr'
+        test_value = 'aanhizixqqpcfjufuqom'
         self.instance.tableName = test_value
         self.assertEqual(self.instance.tableName, test_value)
 
@@ -47,7 +48,7 @@ def test_fieldName_property(self):
         """
         Test fieldName property
         """
-        test_value = 'mzzclrhafrklvvmeennl'
+        test_value = 'qwtvuaxuexmjiacvuhma'
         self.instance.fieldName = test_value
         self.assertEqual(self.instance.fieldName, test_value)
 
@@ -55,7 +56,7 @@ def test_language_property(self):
         """
         Test language property
         """
-        test_value = 'nmnhnxujktjszazvlyoe'
+        test_value = 'hrtvteidgzibhlqybpgq'
         self.instance.language = test_value
         self.assertEqual(self.instance.language, test_value)
 
@@ -63,7 +64,7 @@ def test_translation_property(self):
         """
         Test translation property
         """
-        test_value = 'tmsstuwwhniszwpvpzkq'
+        test_value = 'kcjmaskfeydtyoqxdwzk'
         self.instance.translation = test_value
         self.assertEqual(self.instance.translation, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_trips.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_trips.py
index ad30671..7289ce2 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_trips.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_data/tests/test_gtfs_rt_producer_data_generaltransitfeedstatic_trips.py
@@ -9,11 +9,12 @@ sys.path.append(os.path.realpath(os.path.join(os.path.dirname(__file__), '../src'.replace('/', os.sep))))
 from gtfs_rt_producer_data.generaltransitfeedstatic.trips import Trips
-from test_gtfs_rt_producer_data_generaltransitfeedstatic_calendar import Test_Calendar
-from test_gtfs_rt_producer_data_generaltransitfeedstatic_calendardates import Test_CalendarDates
 from test_gtfs_rt_producer_data_generaltransitfeedstatic_wheelchairaccessible import Test_WheelchairAccessible
+from test_gtfs_rt_producer_data_generaltransitfeedstatic_calendar import Test_Calendar
 from test_gtfs_rt_producer_data_generaltransitfeedstatic_directionid import Test_DirectionId
 from test_gtfs_rt_producer_data_generaltransitfeedstatic_bikesallowed import Test_BikesAllowed
+from test_gtfs_rt_producer_data_generaltransitfeedstatic_calendardates import Test_CalendarDates
+
 
 class Test_Trips(unittest.TestCase):
     """
@@ -32,15 +33,15 @@ def create_instance():
         Create instance of Trips for testing
         """
         instance = Trips(
-            routeId='qzqqtajjczwhyujonbsp',
+            routeId='jnkltylqdrbhkgwalqpm',
             serviceDates=Test_Calendar.create_instance(),
-            serviceExceptions=[Test_CalendarDates.create_instance()],
-            tripId='tghtthpadsnrynykmltt',
-            tripHeadsign='yzzfelgnfsphmrsvhxpm',
-            tripShortName='wiwnlylrdvbmrjsfagux',
+            serviceExceptions=[Test_CalendarDates.create_instance(), Test_CalendarDates.create_instance()],
+            tripId='hhdkbjxdqscfinwadzyi',
+            tripHeadsign='pnkvoidmlnzzepumdbic',
+            tripShortName='mipqouthkhusfpaaqnsx',
             directionId=Test_DirectionId.create_instance(),
-            blockId='hosulvqiuysgpokvrgte',
-            shapeId='vwgbycsbtjagolpwqtga',
+            blockId='bmkjrmlmeohgafbdxsux',
+            shapeId='zzwxmbrjfotcwlvbsgpx',
             wheelchairAccessible=Test_WheelchairAccessible.create_instance(),
             bikesAllowed=Test_BikesAllowed.create_instance()
         )
@@ -51,7 +52,7 @@ def test_routeId_property(self):
         """
         Test routeId property
         """
-        test_value = 'qzqqtajjczwhyujonbsp'
+        test_value = 'jnkltylqdrbhkgwalqpm'
         self.instance.routeId = test_value
         self.assertEqual(self.instance.routeId, test_value)
 
@@ -67,7 +68,7 @@ def test_serviceExceptions_property(self):
         """
         Test serviceExceptions property
         """
-        test_value = [Test_CalendarDates.create_instance()]
+        test_value = [Test_CalendarDates.create_instance(), Test_CalendarDates.create_instance()]
         self.instance.serviceExceptions = test_value
         self.assertEqual(self.instance.serviceExceptions, test_value)
 
@@ -75,7 +76,7 @@ def test_tripId_property(self):
         """
         Test tripId property
         """
-        test_value = 'tghtthpadsnrynykmltt'
+        test_value = 'hhdkbjxdqscfinwadzyi'
         self.instance.tripId = test_value
         self.assertEqual(self.instance.tripId, test_value)
 
@@ -83,7 +84,7 @@ def test_tripHeadsign_property(self):
         """
         Test tripHeadsign property
         """
-        test_value = 'yzzfelgnfsphmrsvhxpm'
+        test_value = 'pnkvoidmlnzzepumdbic'
         self.instance.tripHeadsign = test_value
         self.assertEqual(self.instance.tripHeadsign, test_value)
 
@@ -91,7 +92,7 @@ def test_tripShortName_property(self):
         """
         Test tripShortName property
         """
-        test_value = 'wiwnlylrdvbmrjsfagux'
+        test_value = 'mipqouthkhusfpaaqnsx'
         self.instance.tripShortName = test_value
         self.assertEqual(self.instance.tripShortName, test_value)
 
@@ -107,7 +108,7 @@ def test_blockId_property(self):
         """
         Test blockId property
         """
-        test_value = 'hosulvqiuysgpokvrgte'
+        test_value = 'bmkjrmlmeohgafbdxsux'
         self.instance.blockId = test_value
         self.assertEqual(self.instance.blockId, test_value)
 
@@ -115,7 +116,7 @@ def test_shapeId_property(self):
         """
         Test shapeId property
         """
-        test_value = 'vwgbycsbtjagolpwqtga'
+        test_value = 'zzwxmbrjfotcwlvbsgpx'
         self.instance.shapeId = test_value
         self.assertEqual(self.instance.shapeId, test_value)
 
diff --git a/gtfs/gtfs_rt_producer/gtfs_rt_producer_kafka_producer/src/gtfs_rt_producer_kafka_producer/producer.py b/gtfs/gtfs_rt_producer/gtfs_rt_producer_kafka_producer/src/gtfs_rt_producer_kafka_producer/producer.py
index ab11ec9..d77f14a 100644
--- a/gtfs/gtfs_rt_producer/gtfs_rt_producer_kafka_producer/src/gtfs_rt_producer_kafka_producer/producer.py
+++ b/gtfs/gtfs_rt_producer/gtfs_rt_producer_kafka_producer/src/gtfs_rt_producer_kafka_producer/producer.py
@@ -89,7 +89,7 @@ async def send_general_transit_feed_real_time_vehicle_vehicle_position(self,_fee
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -99,55 +99,6 @@
         if flush_producer:
             self.producer.flush()
 
-    @classmethod
-    def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
-        """
-        Parse the connection string and extract bootstrap server, topic name, username, and password.
-
-        Args:
-            connection_string (str): The connection string.
-
-        Returns:
-            Tuple[Dict[str, str], str]: Kafka config, topic name
-        """
-        config_dict = {
-            'security.protocol': 'SASL_SSL',
-            'sasl.mechanisms': 'PLAIN',
-            'sasl.username': '$ConnectionString',
-            'sasl.password': connection_string.strip()
-        }
-        kafka_topic = None
-        try:
-            for part in connection_string.split(';'):
-                if 'Endpoint' in part:
-                    config_dict['bootstrap.servers'] = part.split('=')[1].strip(
-                        '"').replace('sb://', '').replace('/', '')+':9093'
-                elif 'EntityPath' in part:
-                    kafka_topic = part.split('=')[1].strip('"')
-        except IndexError as e:
-            raise ValueError("Invalid connection string format") from e
-        return config_dict, kafka_topic
-
-    @classmethod
-    def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedRealTimeEventProducer':
-        """
-        Create a Kafka producer from a connection string and a topic name.
-
-        Args:
-            connection_string (str): The connection string.
-            topic (Optional[str]): The Kafka topic.
-            content_mode (typing.Literal['structured','binary']): The content mode to use for sending events
-
-        Returns:
-            Producer: The Kafka producer
-        """
-        config, topic_name = cls.parse_connection_string(connection_string)
-        if topic:
-            topic_name = topic
-        if not topic_name:
-            raise ValueError("Topic name not found in connection string")
-        return cls(Producer(config), topic_name, content_mode)
-
 
     async def send_general_transit_feed_real_time_trip_trip_update(self,_feedurl : str, _agencyid : str, data: TripUpdate, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, TripUpdate], str]=None) -> None:
         """
@@ -171,7 +122,7 @@ async def send_general_transit_feed_real_time_trip_trip_update(self,_feedurl : s
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -181,55 +132,6 @@
         if flush_producer:
             self.producer.flush()
 
-    @classmethod
-    def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
-        """
-        Parse the connection string and extract bootstrap server, topic name, username, and password.
-
-        Args:
-            connection_string (str): The connection string.
-
-        Returns:
-            Tuple[Dict[str, str], str]: Kafka config, topic name
-        """
-        config_dict = {
-            'security.protocol': 'SASL_SSL',
-            'sasl.mechanisms': 'PLAIN',
-            'sasl.username': '$ConnectionString',
-            'sasl.password': connection_string.strip()
-        }
-        kafka_topic = None
-        try:
-            for part in connection_string.split(';'):
-                if 'Endpoint' in part:
-                    config_dict['bootstrap.servers'] = part.split('=')[1].strip(
-                        '"').replace('sb://', '').replace('/', '')+':9093'
-                elif 'EntityPath' in part:
-                    kafka_topic = part.split('=')[1].strip('"')
-        except IndexError as e:
-            raise ValueError("Invalid connection string format") from e
-        return config_dict, kafka_topic
-
-    @classmethod
-    def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedRealTimeEventProducer':
-        """
-        Create a Kafka producer from a connection string and a topic name.
-
-        Args:
-            connection_string (str): The connection string.
-            topic (Optional[str]): The Kafka topic.
-            content_mode (typing.Literal['structured','binary']): The content mode to use for sending events
-
-        Returns:
-            Producer: The Kafka producer
-        """
-        config, topic_name = cls.parse_connection_string(connection_string)
-        if topic:
-            topic_name = topic
-        if not topic_name:
-            raise ValueError("Topic name not found in connection string")
-        return cls(Producer(config), topic_name, content_mode)
-
 
     async def send_general_transit_feed_real_time_alert_alert(self,_feedurl : str, _agencyid : str, data: Alert, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Alert], str]=None) -> None:
         """
@@ -253,7 +155,7 @@ async def send_general_transit_feed_real_time_alert_alert(self,_feedurl : str, _
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -263,6 +165,7 @@
         if flush_producer:
             self.producer.flush()
 
+
     @classmethod
     def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
         """
@@ -364,7 +267,7 @@ async def send_general_transit_feed_static_agency(self,_feedurl : str, _agencyid
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -374,55 +277,6 @@
         if flush_producer:
             self.producer.flush()
 
-    @classmethod
-    def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
-        """
-        Parse the connection string and extract bootstrap server, topic name, username, and password.
-
-        Args:
-            connection_string (str): The connection string.
-
-        Returns:
-            Tuple[Dict[str, str], str]: Kafka config, topic name
-        """
-        config_dict = {
-            'security.protocol': 'SASL_SSL',
-            'sasl.mechanisms': 'PLAIN',
-            'sasl.username': '$ConnectionString',
-            'sasl.password': connection_string.strip()
-        }
-        kafka_topic = None
-        try:
-            for part in connection_string.split(';'):
-                if 'Endpoint' in part:
-                    config_dict['bootstrap.servers'] = part.split('=')[1].strip(
-                        '"').replace('sb://', '').replace('/', '')+':9093'
-                elif 'EntityPath' in part:
-                    kafka_topic = part.split('=')[1].strip('"')
-        except IndexError as e:
-            raise ValueError("Invalid connection string format") from e
-        return config_dict, kafka_topic
-
-    @classmethod
-    def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer':
-        """
-        Create a Kafka producer from a connection string and a topic name.
-
-        Args:
-            connection_string (str): The connection string.
-            topic (Optional[str]): The Kafka topic.
-            content_mode (typing.Literal['structured','binary']): The content mode to use for sending events
-
-        Returns:
-            Producer: The Kafka producer
-        """
-        config, topic_name = cls.parse_connection_string(connection_string)
-        if topic:
-            topic_name = topic
-        if not topic_name:
-            raise ValueError("Topic name not found in connection string")
-        return cls(Producer(config), topic_name, content_mode)
-
 
     async def send_general_transit_feed_static_areas(self,_feedurl : str, _agencyid : str, data: Areas, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Areas], str]=None) -> None:
         """
@@ -446,7 +300,7 @@ async def send_general_transit_feed_static_areas(self,_feedurl : str, _agencyid
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -456,55 +310,6 @@
         if flush_producer:
             self.producer.flush()
 
-    @classmethod
-    def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
-        """
-        Parse the connection string and extract bootstrap server, topic name, username, and password.
-
-        Args:
-            connection_string (str): The connection string.
-
-        Returns:
-            Tuple[Dict[str, str], str]: Kafka config, topic name
-        """
-        config_dict = {
-            'security.protocol': 'SASL_SSL',
-            'sasl.mechanisms': 'PLAIN',
-            'sasl.username': '$ConnectionString',
-            'sasl.password': connection_string.strip()
-        }
-        kafka_topic = None
-        try:
-            for part in connection_string.split(';'):
-                if 'Endpoint' in part:
-                    config_dict['bootstrap.servers'] = part.split('=')[1].strip(
-                        '"').replace('sb://', '').replace('/', '')+':9093'
-                elif 'EntityPath' in part:
-                    kafka_topic = part.split('=')[1].strip('"')
-        except IndexError as e:
-            raise ValueError("Invalid connection string format") from e
-        return config_dict, kafka_topic
-
-    @classmethod
-    def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer':
-        """
-        Create a Kafka producer from a connection string and a topic name.
-
-        Args:
-            connection_string (str): The connection string.
-            topic (Optional[str]): The Kafka topic.
-            content_mode (typing.Literal['structured','binary']): The content mode to use for sending events
-
-        Returns:
-            Producer: The Kafka producer
-        """
-        config, topic_name = cls.parse_connection_string(connection_string)
-        if topic:
-            topic_name = topic
-        if not topic_name:
-            raise ValueError("Topic name not found in connection string")
-        return cls(Producer(config), topic_name, content_mode)
-
 
     async def send_general_transit_feed_static_attributions(self,_feedurl : str, _agencyid : str, data: Attributions, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Attributions], str]=None) -> None:
         """
@@ -528,7 +333,7 @@ async def send_general_transit_feed_static_attributions(self,_feedurl : str, _ag
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -538,55 +343,6 @@
         if flush_producer:
             self.producer.flush()
 
-    @classmethod
-    def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
-        """
-        Parse the connection string and extract bootstrap server, topic name, username, and password.
-
-        Args:
-            connection_string (str): The connection string.
-
-        Returns:
-            Tuple[Dict[str, str], str]: Kafka config, topic name
-        """
-        config_dict = {
-            'security.protocol': 'SASL_SSL',
-            'sasl.mechanisms': 'PLAIN',
-            'sasl.username': '$ConnectionString',
-            'sasl.password': connection_string.strip()
-        }
-        kafka_topic = None
-        try:
-            for part in connection_string.split(';'):
-                if 'Endpoint' in part:
-                    config_dict['bootstrap.servers'] = part.split('=')[1].strip(
-                        '"').replace('sb://', '').replace('/', '')+':9093'
-                elif 'EntityPath' in part:
-                    kafka_topic = part.split('=')[1].strip('"')
-        except IndexError as e:
-            raise ValueError("Invalid connection string format") from e
-        return config_dict, kafka_topic
-
-    @classmethod
-    def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer':
-        """
-        Create a Kafka producer from a connection string and a topic name.
-
-        Args:
-            connection_string (str): The connection string.
-            topic (Optional[str]): The Kafka topic.
-            content_mode (typing.Literal['structured','binary']): The content mode to use for sending events
-
-        Returns:
-            Producer: The Kafka producer
-        """
-        config, topic_name = cls.parse_connection_string(connection_string)
-        if topic:
-            topic_name = topic
-        if not topic_name:
-            raise ValueError("Topic name not found in connection string")
-        return cls(Producer(config), topic_name, content_mode)
-
 
     async def send_general_transit_feed_booking_rules(self,_feedurl : str, _agencyid : str, data: BookingRules, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, BookingRules], str]=None) -> None:
         """
@@ -610,7 +366,7 @@ async def send_general_transit_feed_booking_rules(self,_feedurl : str, _agencyid
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -620,59 +376,10 @@
         if flush_producer:
             self.producer.flush()
 
-    @classmethod
-    def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
+
+    async def send_general_transit_feed_static_fare_attributes(self,_feedurl : str, _agencyid : str, data: FareAttributes, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FareAttributes], str]=None) -> None:
         """
-        Parse the connection string and extract bootstrap server, topic name, username, and password.
-
-        Args:
-            connection_string (str): The connection string.
-
-        Returns:
-            Tuple[Dict[str, str], str]: Kafka config, topic name
-        """
-        config_dict = {
-            'security.protocol': 'SASL_SSL',
-            'sasl.mechanisms': 'PLAIN',
-            'sasl.username': '$ConnectionString',
-            'sasl.password': connection_string.strip()
-        }
-        kafka_topic = None
-        try:
-            for part in connection_string.split(';'):
-                if 'Endpoint' in part:
-                    config_dict['bootstrap.servers'] = part.split('=')[1].strip(
-                        '"').replace('sb://', '').replace('/', '')+':9093'
-                elif 'EntityPath' in part:
-                    kafka_topic = part.split('=')[1].strip('"')
-        except IndexError as e:
-            raise ValueError("Invalid connection string format") from e
-        return config_dict, kafka_topic
-
-    @classmethod
-    def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer':
-        """
-        Create a Kafka producer from a connection string and a topic name.
-
-        Args:
-            connection_string (str): The connection string.
-            topic (Optional[str]): The Kafka topic.
-            content_mode (typing.Literal['structured','binary']): The content mode to use for sending events
-
-        Returns:
-            Producer: The Kafka producer
-        """
-        config, topic_name = cls.parse_connection_string(connection_string)
-        if topic:
-            topic_name = topic
-        if not topic_name:
-            raise ValueError("Topic name not found in connection string")
-        return cls(Producer(config), topic_name, content_mode)
-
-
-    async def send_general_transit_feed_static_fare_attributes(self,_feedurl : str, _agencyid : str, data: FareAttributes, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FareAttributes], str]=None) -> None:
-        """
-        Sends the 'GeneralTransitFeedStatic.FareAttributes' event to the Kafka topic
+        Sends the 'GeneralTransitFeedStatic.FareAttributes' event to the Kafka topic
 
         Args:
             _feedurl(str): Value for placeholder feedurl in attribute source
@@ -692,7 +399,7 @@ async def send_general_transit_feed_static_fare_attributes(self,_feedurl : str,
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -702,55 +409,6 @@
         if flush_producer:
             self.producer.flush()
 
-    @classmethod
-    def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
-        """
-        Parse the connection string and extract bootstrap server, topic name, username, and password.
-
-        Args:
-            connection_string (str): The connection string.
-
-        Returns:
-            Tuple[Dict[str, str], str]: Kafka config, topic name
-        """
-        config_dict = {
-            'security.protocol': 'SASL_SSL',
-            'sasl.mechanisms': 'PLAIN',
-            'sasl.username': '$ConnectionString',
-            'sasl.password': connection_string.strip()
-        }
-        kafka_topic = None
-        try:
-            for part in connection_string.split(';'):
-                if 'Endpoint' in part:
-                    config_dict['bootstrap.servers'] = part.split('=')[1].strip(
-                        '"').replace('sb://', '').replace('/', '')+':9093'
-                elif 'EntityPath' in part:
-                    kafka_topic = part.split('=')[1].strip('"')
-        except IndexError as e:
-            raise ValueError("Invalid connection string format") from e
-        return config_dict, kafka_topic
-
-    @classmethod
-    def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer':
-        """
-        Create a Kafka producer from a connection string and a topic name.
-
-        Args:
-            connection_string (str): The connection string.
-            topic (Optional[str]): The Kafka topic.
-            content_mode (typing.Literal['structured','binary']): The content mode to use for sending events
-
-        Returns:
-            Producer: The Kafka producer
-        """
-        config, topic_name = cls.parse_connection_string(connection_string)
-        if topic:
-            topic_name = topic
-        if not topic_name:
-            raise ValueError("Topic name not found in connection string")
-        return cls(Producer(config), topic_name, content_mode)
-
 
     async def send_general_transit_feed_static_fare_leg_rules(self,_feedurl : str, _agencyid : str, data: FareLegRules, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FareLegRules], str]=None) -> None:
         """
@@ -774,7 +432,7 @@ async def send_general_transit_feed_static_fare_leg_rules(self,_feedurl : str, _
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
@@ -784,55 +442,6 @@
         if flush_producer:
             self.producer.flush()
 
-    @classmethod
-    def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]:
-        """
-        Parse the connection string and extract bootstrap server, topic name, username, and password.
-
-        Args:
-            connection_string (str): The connection string.
-
-        Returns:
-            Tuple[Dict[str, str], str]: Kafka config, topic name
-        """
-        config_dict = {
-            'security.protocol': 'SASL_SSL',
-            'sasl.mechanisms': 'PLAIN',
-            'sasl.username': '$ConnectionString',
-            'sasl.password': connection_string.strip()
-        }
-        kafka_topic = None
-        try:
-            for part in connection_string.split(';'):
-                if 'Endpoint' in part:
-                    config_dict['bootstrap.servers'] = part.split('=')[1].strip(
-                        '"').replace('sb://', '').replace('/', '')+':9093'
-                elif 'EntityPath' in part:
-                    kafka_topic = part.split('=')[1].strip('"')
-        except IndexError as e:
-            raise ValueError("Invalid connection string format") from e
-        return config_dict, kafka_topic
-
-    @classmethod
-    def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer':
-        """
-        Create a Kafka producer from a connection string and a topic name.
-
-        Args:
-            connection_string (str): The connection string.
-            topic (Optional[str]): The Kafka topic.
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_fare_media(self,_feedurl : str, _agencyid : str, data: FareMedia, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FareMedia], str]=None) -> None: """ @@ -856,7 +465,7 @@ async def send_general_transit_feed_static_fare_media(self,_feedurl : str, _agen attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -866,55 +475,6 @@ async def send_general_transit_feed_static_fare_media(self,_feedurl : str, _agen if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. - - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. 
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_fare_products(self,_feedurl : str, _agencyid : str, data: FareProducts, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FareProducts], str]=None) -> None: """ @@ -938,7 +498,7 @@ async def send_general_transit_feed_static_fare_products(self,_feedurl : str, _a attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -948,55 +508,6 @@ async def send_general_transit_feed_static_fare_products(self,_feedurl : str, _a if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. - - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. 
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_fare_rules(self,_feedurl : str, _agencyid : str, data: FareRules, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FareRules], str]=None) -> None: """ @@ -1020,7 +531,7 @@ async def send_general_transit_feed_static_fare_rules(self,_feedurl : str, _agen attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1030,59 +541,10 @@ async def send_general_transit_feed_static_fare_rules(self,_feedurl : str, _agen if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: + + async def send_general_transit_feed_static_fare_transfer_rules(self,_feedurl : str, _agencyid : str, data: FareTransferRules, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FareTransferRules], str]=None) -> None: """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. - - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. 
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - - - async def send_general_transit_feed_static_fare_transfer_rules(self,_feedurl : str, _agencyid : str, data: FareTransferRules, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FareTransferRules], str]=None) -> None: - """ - Sends the 'GeneralTransitFeedStatic.FareTransferRules' event to the Kafka topic + Sends the 'GeneralTransitFeedStatic.FareTransferRules' event to the Kafka topic Args: _feedurl(str): Value for placeholder feedurl in attribute source @@ -1102,7 +564,7 @@ async def send_general_transit_feed_static_fare_transfer_rules(self,_feedurl : s attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1112,55 +574,6 @@ async def send_general_transit_feed_static_fare_transfer_rules(self,_feedurl : s if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. - - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. 
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_feed_info(self,_feedurl : str, _agencyid : str, data: FeedInfo, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, FeedInfo], str]=None) -> None: """ @@ -1184,7 +597,7 @@ async def send_general_transit_feed_static_feed_info(self,_feedurl : str, _agenc attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1194,55 +607,6 @@ async def send_general_transit_feed_static_feed_info(self,_feedurl : str, _agenc if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. - - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. 
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_frequencies(self,_feedurl : str, _agencyid : str, data: Frequencies, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Frequencies], str]=None) -> None: """ @@ -1266,7 +630,7 @@ async def send_general_transit_feed_static_frequencies(self,_feedurl : str, _age attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1276,55 +640,6 @@ async def send_general_transit_feed_static_frequencies(self,_feedurl : str, _age if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. - - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. 
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_levels(self,_feedurl : str, _agencyid : str, data: Levels, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Levels], str]=None) -> None: """ @@ -1348,7 +663,7 @@ async def send_general_transit_feed_static_levels(self,_feedurl : str, _agencyid attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1358,55 +673,6 @@ async def send_general_transit_feed_static_levels(self,_feedurl : str, _agencyid if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. - - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. 
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_location_geo_json(self,_feedurl : str, _agencyid : str, data: LocationGeoJson, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, LocationGeoJson], str]=None) -> None: """ @@ -1430,7 +696,7 @@ async def send_general_transit_feed_static_location_geo_json(self,_feedurl : str attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1440,68 +706,19 @@ async def send_general_transit_feed_static_location_geo_json(self,_feedurl : str if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: + + async def send_general_transit_feed_static_location_groups(self,_feedurl : str, _agencyid : str, data: LocationGroups, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, LocationGroups], str]=None) -> None: """ - Parse the connection string and extract bootstrap server, topic name, username, and password. + Sends the 'GeneralTransitFeedStatic.LocationGroups' event to the Kafka topic Args: - connection_string (str): The connection string. - - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. 
- content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - - - async def send_general_transit_feed_static_location_groups(self,_feedurl : str, _agencyid : str, data: LocationGroups, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, LocationGroups], str]=None) -> None: - """ - Sends the 'GeneralTransitFeedStatic.LocationGroups' event to the Kafka topic - - Args: - _feedurl(str): Value for placeholder feedurl in attribute source - _agencyid(str): Value for placeholder agencyid in attribute subject - data: (LocationGroups): The event data to be sent - content_type (str): The content type that the event data shall be sent with - flush_producer(bool): Whether to flush the producer after sending the event (default: True) - key_mapper(Callable[[CloudEvent, LocationGroups], str]): A function to map the CloudEvent contents to a Kafka key (default: None). - The default key mapper maps the CloudEvent type, source, and subject to the Kafka key + _feedurl(str): Value for placeholder feedurl in attribute source + _agencyid(str): Value for placeholder agencyid in attribute subject + data: (LocationGroups): The event data to be sent + content_type (str): The content type that the event data shall be sent with + flush_producer(bool): Whether to flush the producer after sending the event (default: True) + key_mapper(Callable[[CloudEvent, LocationGroups], str]): A function to map the CloudEvent contents to a Kafka key (default: None). + The default key mapper maps the CloudEvent type, source, and subject to the Kafka key """ attributes = { "specversion":"1.0", @@ -1512,7 +729,7 @@ async def send_general_transit_feed_static_location_groups(self,_feedurl : str, attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1522,55 +739,6 @@ async def send_general_transit_feed_static_location_groups(self,_feedurl : str, if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_location_group_stores(self,_feedurl : str, _agencyid : str, data: LocationGroupStores, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, LocationGroupStores], str]=None) -> None: """ @@ -1594,7 +762,7 @@ async def send_general_transit_feed_static_location_group_stores(self,_feedurl : attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1604,55 +772,6 @@ async def send_general_transit_feed_static_location_group_stores(self,_feedurl : if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_networks(self,_feedurl : str, _agencyid : str, data: Networks, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Networks], str]=None) -> None: """ @@ -1676,7 +795,7 @@ async def send_general_transit_feed_static_networks(self,_feedurl : str, _agency attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1686,55 +805,6 @@ async def send_general_transit_feed_static_networks(self,_feedurl : str, _agency if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_pathways(self,_feedurl : str, _agencyid : str, data: Pathways, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Pathways], str]=None) -> None: """ @@ -1758,7 +828,7 @@ async def send_general_transit_feed_static_pathways(self,_feedurl : str, _agency attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1768,55 +838,6 @@ async def send_general_transit_feed_static_pathways(self,_feedurl : str, _agency if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_route_networks(self,_feedurl : str, _agencyid : str, data: RouteNetworks, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, RouteNetworks], str]=None) -> None: """ @@ -1840,7 +861,7 @@ async def send_general_transit_feed_static_route_networks(self,_feedurl : str, _ attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1850,55 +871,6 @@ async def send_general_transit_feed_static_route_networks(self,_feedurl : str, _ if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_routes(self,_feedurl : str, _agencyid : str, data: Routes, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Routes], str]=None) -> None: """ @@ -1922,7 +894,7 @@ async def send_general_transit_feed_static_routes(self,_feedurl : str, _agencyid attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -1932,55 +904,6 @@ async def send_general_transit_feed_static_routes(self,_feedurl : str, _agencyid if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_shapes(self,_feedurl : str, _agencyid : str, data: Shapes, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Shapes], str]=None) -> None: """ @@ -2004,7 +927,7 @@ async def send_general_transit_feed_static_shapes(self,_feedurl : str, _agencyid attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -2014,55 +937,6 @@ async def send_general_transit_feed_static_shapes(self,_feedurl : str, _agencyid if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_stop_areas(self,_feedurl : str, _agencyid : str, data: StopAreas, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, StopAreas], str]=None) -> None: """ @@ -2084,66 +958,17 @@ async def send_general_transit_feed_static_stop_areas(self,_feedurl : str, _agen "subject":"{agencyid}".format(agencyid = _agencyid) } attributes["datacontenttype"] = content_type - event = CloudEvent.create(attributes, data) - if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) - message.headers[b"content-type"] = b"application/cloudevents+json" - else: - content_type = "application/json" - event["content-type"] = content_type - message = to_binary(event, data_marshaller=lambda x: x.to_byte_array(content_type), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) - self.producer.produce(self.topic, key=message.key, value=message.value, headers=message.headers) - if flush_producer: - self.producer.flush() - - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) + event = CloudEvent.create(attributes, data) + if self.content_mode == "structured": + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message.headers[b"content-type"] = b"application/cloudevents+json" + else: + content_type = "application/json" + event["content-type"] = content_type + message = to_binary(event, data_marshaller=lambda x: x.to_byte_array(content_type), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + self.producer.produce(self.topic, key=message.key, value=message.value, headers=message.headers) + if flush_producer: + self.producer.flush() async def send_general_transit_feed_static_stops(self,_feedurl : str, _agencyid : str, data: Stops, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Stops], str]=None) -> None: @@ -2168,7 +993,7 @@ async def send_general_transit_feed_static_stops(self,_feedurl : str, _agencyid attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -2178,55 +1003,6 @@ async def send_general_transit_feed_static_stops(self,_feedurl : str, _agencyid if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_stop_times(self,_feedurl : str, _agencyid : str, data: StopTimes, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, StopTimes], str]=None) -> None: """ @@ -2250,7 +1026,7 @@ async def send_general_transit_feed_static_stop_times(self,_feedurl : str, _agen attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -2260,55 +1036,6 @@ async def send_general_transit_feed_static_stop_times(self,_feedurl : str, _agen if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_timeframes(self,_feedurl : str, _agencyid : str, data: Timeframes, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Timeframes], str]=None) -> None: """ @@ -2332,7 +1059,7 @@ async def send_general_transit_feed_static_timeframes(self,_feedurl : str, _agen attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -2342,55 +1069,6 @@ async def send_general_transit_feed_static_timeframes(self,_feedurl : str, _agen if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_transfers(self,_feedurl : str, _agencyid : str, data: Transfers, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Transfers], str]=None) -> None: """ @@ -2414,7 +1092,7 @@ async def send_general_transit_feed_static_transfers(self,_feedurl : str, _agenc attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -2424,55 +1102,6 @@ async def send_general_transit_feed_static_transfers(self,_feedurl : str, _agenc if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_translations(self,_feedurl : str, _agencyid : str, data: Translations, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Translations], str]=None) -> None: """ @@ -2496,7 +1125,7 @@ async def send_general_transit_feed_static_translations(self,_feedurl : str, _ag attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -2506,55 +1135,6 @@ async def send_general_transit_feed_static_translations(self,_feedurl : str, _ag if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'GeneralTransitFeedStaticEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_general_transit_feed_static_trips(self,_feedurl : str, _agencyid : str, data: Trips, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, Trips], str]=None) -> None: """ @@ -2578,7 +1158,7 @@ async def send_general_transit_feed_static_trips(self,_feedurl : str, _agencyid attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -2588,6 +1168,7 @@ async def send_general_transit_feed_static_trips(self,_feedurl : str, _agencyid if flush_producer: self.producer.flush() + @classmethod def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: """ diff --git a/gtfs/xreg/create-kql-script.ps1 b/gtfs/kql/create-kql-script.ps1 similarity index 95% rename from gtfs/xreg/create-kql-script.ps1 rename to gtfs/kql/create-kql-script.ps1 index d3a11e5..1afae9d 100644 --- a/gtfs/xreg/create-kql-script.ps1 +++ b/gtfs/kql/create-kql-script.ps1 @@ -1,6 +1,6 @@ $scriptPath = Split-Path -Parent $PSCommandPath -$jsonFiles = Get-ChildItem -Path "$scriptPath/gtfs-static" -Filter "*.avsc" | Select-Object -ExpandProperty FullName -$gtfsRtFiles = Get-ChildItem -Path "$scriptPath" -Filter "gtfs-rt-*.avsc" | Select-Object -ExpandProperty FullName +$jsonFiles = Get-ChildItem -Path "$scriptPath/../xreg/gtfs-static" -Filter "*.avsc" | Select-Object -ExpandProperty FullName +$gtfsRtFiles = Get-ChildItem -Path "$scriptPath/../xreg/" -Filter "gtfs-rt-*.avsc" | Select-Object -ExpandProperty FullName $jsonFiles += $gtfsRtFiles $outputFile = ".schemas.avsc" diff --git a/gtfs/xreg/gtfs.kql b/gtfs/kql/gtfs.kql similarity index 97% rename from 
gtfs/xreg/gtfs.kql rename to gtfs/kql/gtfs.kql index 664fe4a..52c9fec 100644 --- a/gtfs/xreg/gtfs.kql +++ b/gtfs/kql/gtfs.kql @@ -2789,10 +2789,10 @@ stop_schedule_relationship = tostring(stop_time_update.schedule_relationship) | extend arrival_delay = toint(arrival.delay), - arrival_time = unixtime_seconds_todatetime(arrival.time), + arrival_time = unixtime_seconds_todatetime(toint(arrival.['time'])), arrival_uncertainty = toint(arrival.uncertainty), departure_delay = toint(departure.delay), - departure_time = unixtime_seconds_todatetime(departure.time), + departure_time = unixtime_seconds_todatetime(toint(departure.['time'])), departure_uncertainty = toint(departure.uncertainty) | extend trip_id = tostring(trip.trip_id), diff --git a/gtfs/xreg/gtfs.xreg.json b/gtfs/xreg/gtfs.xreg.json index 7362036..318395a 100644 --- a/gtfs/xreg/gtfs.xreg.json +++ b/gtfs/xreg/gtfs.xreg.json @@ -12,9 +12,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedRealTime/schemas/GeneralTransitFeedRealTime.Vehicle.VehiclePosition", - "createdat": "2024-09-14T19:33:53.276277", + "createdat": "2024-09-18T14:18:22.193499", "epoch": 0, - "modifiedat": "2024-09-14T19:33:53.276277", + "modifiedat": "2024-09-18T14:18:22.193499", "metadata": { "specversion": { "name": "specversion", @@ -51,9 +51,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedRealTime/schemas/GeneralTransitFeedRealTime.Trip.TripUpdate", - "createdat": "2024-09-14T19:33:56.509005", + "createdat": "2024-09-18T14:18:25.396108", "epoch": 0, - "modifiedat": "2024-09-14T19:33:56.509005", + "modifiedat": "2024-09-18T14:18:25.396108", "metadata": { "specversion": { "name": "specversion", @@ -90,9 +90,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedRealTime/schemas/GeneralTransitFeedRealTime.Alert.Alert", - "createdat": "2024-09-14T19:33:59.755279", + "createdat": "2024-09-18T14:18:28.567740", "epoch": 0, - "modifiedat": "2024-09-14T19:33:59.755279", + "modifiedat": "2024-09-18T14:18:28.567740", "metadata": { "specversion": { "name": "specversion", @@ -124,9 +124,9 @@ } } }, - "createdat": "2024-09-14T19:33:53.274259", + "createdat": "2024-09-18T14:18:22.191092", "epoch": 0, - "modifiedat": "2024-09-14T19:33:53.274259" + "modifiedat": "2024-09-18T14:18:22.191092" }, "GeneralTransitFeedStatic": { "id": "GeneralTransitFeedStatic", @@ -137,9 +137,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Agency", - "createdat": "2024-09-14T19:34:02.945459", + "createdat": "2024-09-18T14:18:31.834679", "epoch": 0, - "modifiedat": "2024-09-14T19:34:02.945459", + "modifiedat": "2024-09-18T14:18:31.834679", "metadata": { "specversion": { "name": "specversion", @@ -176,9 +176,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Areas", - "createdat": "2024-09-14T19:34:06.109457", + "createdat": "2024-09-18T14:18:35.115684", "epoch": 0, - "modifiedat": "2024-09-14T19:34:06.109457", + "modifiedat": "2024-09-18T14:18:35.115684", "metadata": { "specversion": { "name": "specversion", @@ -215,9 +215,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Attributions", - "createdat": "2024-09-14T19:34:09.322877", + "createdat": "2024-09-18T14:18:38.318441", "epoch": 0, - "modifiedat": 
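> Regarding the `gtfs.kql` hunk above: `time` apparently collides with KQL's `time`/`timespan` literal syntax, so the update addresses the property of the `dynamic` value with the bracket form `arrival.['time']`, and the added `toint()` coerces the dynamic epoch-seconds value into the integer that `unixtime_seconds_todatetime()` expects. The same conversion in Python, with an illustrative timestamp:

```python
from datetime import datetime, timezone

# Illustrative epoch-seconds value (not taken from a real feed).
arrival_time = 1726668000

# unixtime_seconds_todatetime() performs this conversion on the KQL side.
print(datetime.fromtimestamp(arrival_time, tz=timezone.utc))  # 2024-09-18 14:00:00+00:00
```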
"2024-09-14T19:34:09.322877", + "modifiedat": "2024-09-18T14:18:38.318441", "metadata": { "specversion": { "name": "specversion", @@ -254,9 +254,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeed.BookingRules", - "createdat": "2024-09-14T19:34:12.495400", + "createdat": "2024-09-18T14:18:41.536027", "epoch": 0, - "modifiedat": "2024-09-14T19:34:12.495400", + "modifiedat": "2024-09-18T14:18:41.536027", "metadata": { "specversion": { "name": "specversion", @@ -293,9 +293,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.FareAttributes", - "createdat": "2024-09-14T19:34:15.660243", + "createdat": "2024-09-18T14:18:44.864190", "epoch": 0, - "modifiedat": "2024-09-14T19:34:15.660243", + "modifiedat": "2024-09-18T14:18:44.864190", "metadata": { "specversion": { "name": "specversion", @@ -332,9 +332,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.FareLegRules", - "createdat": "2024-09-14T19:34:18.859990", + "createdat": "2024-09-18T14:18:48.145335", "epoch": 0, - "modifiedat": "2024-09-14T19:34:18.859990", + "modifiedat": "2024-09-18T14:18:48.145335", "metadata": { "specversion": { "name": "specversion", @@ -371,9 +371,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.FareMedia", - "createdat": "2024-09-14T19:34:22.043899", + "createdat": "2024-09-18T14:18:51.545931", "epoch": 0, - "modifiedat": "2024-09-14T19:34:22.043899", + "modifiedat": "2024-09-18T14:18:51.545931", "metadata": { "specversion": { "name": "specversion", @@ -410,9 +410,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.FareProducts", - "createdat": "2024-09-14T19:34:25.251924", + "createdat": "2024-09-18T14:18:55.465763", "epoch": 0, - "modifiedat": "2024-09-14T19:34:25.251924", + "modifiedat": "2024-09-18T14:18:55.465763", "metadata": { "specversion": { "name": "specversion", @@ -449,9 +449,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.FareRules", - "createdat": "2024-09-14T19:34:28.382751", + "createdat": "2024-09-18T14:18:59.022066", "epoch": 0, - "modifiedat": "2024-09-14T19:34:28.382751", + "modifiedat": "2024-09-18T14:18:59.022066", "metadata": { "specversion": { "name": "specversion", @@ -488,9 +488,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.FareTransferRules", - "createdat": "2024-09-14T19:34:31.594438", + "createdat": "2024-09-18T14:19:02.517682", "epoch": 0, - "modifiedat": "2024-09-14T19:34:31.594438", + "modifiedat": "2024-09-18T14:19:02.517682", "metadata": { "specversion": { "name": "specversion", @@ -527,9 +527,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.FeedInfo", - "createdat": "2024-09-14T19:34:34.800372", + "createdat": "2024-09-18T14:19:05.869375", "epoch": 0, - "modifiedat": "2024-09-14T19:34:34.800372", + "modifiedat": "2024-09-18T14:19:05.869375", "metadata": { "specversion": { "name": "specversion", @@ -566,9 +566,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": 
"#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Frequencies", - "createdat": "2024-09-14T19:34:37.975904", + "createdat": "2024-09-18T14:19:09.469690", "epoch": 0, - "modifiedat": "2024-09-14T19:34:37.975904", + "modifiedat": "2024-09-18T14:19:09.469690", "metadata": { "specversion": { "name": "specversion", @@ -605,9 +605,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Levels", - "createdat": "2024-09-14T19:34:41.164118", + "createdat": "2024-09-18T14:19:12.723623", "epoch": 0, - "modifiedat": "2024-09-14T19:34:41.164118", + "modifiedat": "2024-09-18T14:19:12.723623", "metadata": { "specversion": { "name": "specversion", @@ -644,9 +644,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.LocationGeoJson", - "createdat": "2024-09-14T19:34:44.385665", + "createdat": "2024-09-18T14:19:16.053215", "epoch": 0, - "modifiedat": "2024-09-14T19:34:44.385665", + "modifiedat": "2024-09-18T14:19:16.053215", "metadata": { "specversion": { "name": "specversion", @@ -683,9 +683,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.LocationGroups", - "createdat": "2024-09-14T19:34:47.584486", + "createdat": "2024-09-18T14:19:19.334478", "epoch": 0, - "modifiedat": "2024-09-14T19:34:47.584486", + "modifiedat": "2024-09-18T14:19:19.334478", "metadata": { "specversion": { "name": "specversion", @@ -722,9 +722,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.LocationGroupStores", - "createdat": "2024-09-14T19:34:50.772270", + "createdat": "2024-09-18T14:19:22.628633", "epoch": 0, - "modifiedat": "2024-09-14T19:34:50.772270", + "modifiedat": "2024-09-18T14:19:22.628633", "metadata": { "specversion": { "name": "specversion", @@ -761,9 +761,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Networks", - "createdat": "2024-09-14T19:34:53.975672", + "createdat": "2024-09-18T14:19:25.926993", "epoch": 0, - "modifiedat": "2024-09-14T19:34:53.975672", + "modifiedat": "2024-09-18T14:19:25.926993", "metadata": { "specversion": { "name": "specversion", @@ -800,9 +800,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Pathways", - "createdat": "2024-09-14T19:34:57.277576", + "createdat": "2024-09-18T14:19:29.178618", "epoch": 0, - "modifiedat": "2024-09-14T19:34:57.277576", + "modifiedat": "2024-09-18T14:19:29.178618", "metadata": { "specversion": { "name": "specversion", @@ -839,9 +839,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.RouteNetworks", - "createdat": "2024-09-14T19:35:00.516165", + "createdat": "2024-09-18T14:19:32.448316", "epoch": 0, - "modifiedat": "2024-09-14T19:35:00.516165", + "modifiedat": "2024-09-18T14:19:32.448316", "metadata": { "specversion": { "name": "specversion", @@ -878,9 +878,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Routes", - "createdat": "2024-09-14T19:35:03.739840", + "createdat": "2024-09-18T14:19:35.714975", "epoch": 0, - "modifiedat": "2024-09-14T19:35:03.739840", + "modifiedat": 
"2024-09-18T14:19:35.714975", "metadata": { "specversion": { "name": "specversion", @@ -917,9 +917,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Shapes", - "createdat": "2024-09-14T19:35:06.980418", + "createdat": "2024-09-18T14:19:38.989197", "epoch": 0, - "modifiedat": "2024-09-14T19:35:06.980418", + "modifiedat": "2024-09-18T14:19:38.989197", "metadata": { "specversion": { "name": "specversion", @@ -956,9 +956,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.StopAreas", - "createdat": "2024-09-14T19:35:10.194179", + "createdat": "2024-09-18T14:19:42.248833", "epoch": 0, - "modifiedat": "2024-09-14T19:35:10.194179", + "modifiedat": "2024-09-18T14:19:42.248833", "metadata": { "specversion": { "name": "specversion", @@ -995,9 +995,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Stops", - "createdat": "2024-09-14T19:35:13.431326", + "createdat": "2024-09-18T14:19:45.584449", "epoch": 0, - "modifiedat": "2024-09-14T19:35:13.431326", + "modifiedat": "2024-09-18T14:19:45.584449", "metadata": { "specversion": { "name": "specversion", @@ -1034,9 +1034,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.StopTimes", - "createdat": "2024-09-14T19:35:16.641750", + "createdat": "2024-09-18T14:19:48.875701", "epoch": 0, - "modifiedat": "2024-09-14T19:35:16.641750", + "modifiedat": "2024-09-18T14:19:48.875701", "metadata": { "specversion": { "name": "specversion", @@ -1073,9 +1073,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Timeframes", - "createdat": "2024-09-14T19:35:19.852768", + "createdat": "2024-09-18T14:19:52.121539", "epoch": 0, - "modifiedat": "2024-09-14T19:35:19.852768", + "modifiedat": "2024-09-18T14:19:52.121539", "metadata": { "specversion": { "name": "specversion", @@ -1112,9 +1112,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Transfers", - "createdat": "2024-09-14T19:35:23.095140", + "createdat": "2024-09-18T14:19:55.364061", "epoch": 0, - "modifiedat": "2024-09-14T19:35:23.095140", + "modifiedat": "2024-09-18T14:19:55.364061", "metadata": { "specversion": { "name": "specversion", @@ -1151,9 +1151,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Translations", - "createdat": "2024-09-14T19:35:26.333256", + "createdat": "2024-09-18T14:19:58.582820", "epoch": 0, - "modifiedat": "2024-09-14T19:35:26.333256", + "modifiedat": "2024-09-18T14:19:58.582820", "metadata": { "specversion": { "name": "specversion", @@ -1190,9 +1190,9 @@ "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/GeneralTransitFeedStatic/schemas/GeneralTransitFeedStatic.Trips", - "createdat": "2024-09-14T19:35:29.580497", + "createdat": "2024-09-18T14:20:01.902281", "epoch": 0, - "modifiedat": "2024-09-14T19:35:29.580497", + "modifiedat": "2024-09-18T14:20:01.902281", "metadata": { "specversion": { "name": "specversion", @@ -1224,9 +1224,9 @@ } } }, - "createdat": "2024-09-14T19:34:02.940936", + "createdat": "2024-09-18T14:18:31.832628", "epoch": 0, - "modifiedat": "2024-09-14T19:34:02.940936" + 
"modifiedat": "2024-09-18T14:18:31.832628" } }, "schemagroups": { @@ -1517,9 +1517,9 @@ ], "doc": "Realtime positioning information for a given vehicle." }, - "createdat": "2024-09-14T19:33:52.493022", + "createdat": "2024-09-18T14:18:21.380862", "epoch": 0, - "modifiedat": "2024-09-14T19:33:52.493022" + "modifiedat": "2024-09-18T14:18:21.380862" } } }, @@ -1767,9 +1767,9 @@ ], "doc": "Entities used in the feed. Realtime update of the progress of a vehicle along a trip. Depending on the value of ScheduleRelationship, a TripUpdate can specify: - A trip that proceeds along the schedule. - A trip that proceeds along a route but has no fixed schedule. - A trip that have been added or removed with regard to schedule. The updates can be for future, predicted arrival/departure events, or for past events that already occurred. Normally, updates should get more precise and more certain (see uncertainty below) as the events gets closer to current time. Even if that is not possible, the information for past events should be precise and certain. In particular, if an update points to time in the past but its update's uncertainty is not 0, the client should conclude that the update is a (wrong) prediction and that the trip has not completed yet. Note that the update can describe a trip that is already completed. To this end, it is enough to provide an update for the last stop of the trip. If the time of that is in the past, the client will conclude from that that the whole trip is in the past (it is possible, although inconsequential, to also provide updates for preceding stops). This option is most relevant for a trip that has completed ahead of schedule, but according to the schedule, the trip is still proceeding at the current time. Removing the updates for this trip could make the client assume that the trip is still proceeding. Note that the feed provider is allowed, but not required, to purge past updates - this is one case where this would be practically useful." }, - "createdat": "2024-09-14T19:33:55.674913", + "createdat": "2024-09-18T14:18:24.599797", "epoch": 0, - "modifiedat": "2024-09-14T19:33:55.674913" + "modifiedat": "2024-09-18T14:18:24.599797" } } }, @@ -2074,16 +2074,16 @@ ], "doc": "An alert, indicating some sort of incident in the public transit network." 
}, - "createdat": "2024-09-14T19:33:58.921048", + "createdat": "2024-09-18T14:18:27.781447", "epoch": 0, - "modifiedat": "2024-09-14T19:33:58.921048" + "modifiedat": "2024-09-18T14:18:27.781447" } } } }, - "createdat": "2024-09-14T19:33:52.491002", + "createdat": "2024-09-18T14:18:21.378335", "epoch": 0, - "modifiedat": "2024-09-14T19:33:52.491002" + "modifiedat": "2024-09-18T14:18:21.378335" }, "GeneralTransitFeedStatic": { "id": "GeneralTransitFeedStatic", @@ -2159,9 +2159,9 @@ } ] }, - "createdat": "2024-09-14T19:34:02.166799", + "createdat": "2024-09-18T14:18:31.020958", "epoch": 0, - "modifiedat": "2024-09-14T19:34:02.166799" + "modifiedat": "2024-09-18T14:18:31.020958" } } }, @@ -2208,9 +2208,9 @@ } ] }, - "createdat": "2024-09-14T19:34:05.329844", + "createdat": "2024-09-18T14:18:34.277303", "epoch": 0, - "modifiedat": "2024-09-14T19:34:05.329844" + "modifiedat": "2024-09-18T14:18:34.277303" } } }, @@ -2324,9 +2324,9 @@ } ] }, - "createdat": "2024-09-14T19:34:08.516068", + "createdat": "2024-09-18T14:18:37.506571", "epoch": 0, - "modifiedat": "2024-09-14T19:34:08.516068" + "modifiedat": "2024-09-18T14:18:37.506571" } } }, @@ -2373,9 +2373,9 @@ } ] }, - "createdat": "2024-09-14T19:34:11.689538", + "createdat": "2024-09-18T14:18:40.761099", "epoch": 0, - "modifiedat": "2024-09-14T19:34:11.689538" + "modifiedat": "2024-09-18T14:18:40.761099" } } }, @@ -2441,9 +2441,9 @@ } ] }, - "createdat": "2024-09-14T19:34:14.878250", + "createdat": "2024-09-18T14:18:44.049950", "epoch": 0, - "modifiedat": "2024-09-14T19:34:14.878250" + "modifiedat": "2024-09-18T14:18:44.049950" } } }, @@ -2508,9 +2508,9 @@ } ] }, - "createdat": "2024-09-14T19:34:18.076686", + "createdat": "2024-09-18T14:18:47.317174", "epoch": 0, - "modifiedat": "2024-09-14T19:34:18.076686" + "modifiedat": "2024-09-18T14:18:47.317174" } } }, @@ -2557,9 +2557,9 @@ } ] }, - "createdat": "2024-09-14T19:34:21.248903", + "createdat": "2024-09-18T14:18:50.603874", "epoch": 0, - "modifiedat": "2024-09-14T19:34:21.248903" + "modifiedat": "2024-09-18T14:18:50.603874" } } }, @@ -2606,9 +2606,9 @@ } ] }, - "createdat": "2024-09-14T19:34:24.455162", + "createdat": "2024-09-18T14:18:54.598494", "epoch": 0, - "modifiedat": "2024-09-14T19:34:24.455162" + "modifiedat": "2024-09-18T14:18:54.598494" } } }, @@ -2668,9 +2668,9 @@ } ] }, - "createdat": "2024-09-14T19:34:27.606118", + "createdat": "2024-09-18T14:18:58.126714", "epoch": 0, - "modifiedat": "2024-09-14T19:34:27.606118" + "modifiedat": "2024-09-18T14:18:58.126714" } } }, @@ -2744,9 +2744,9 @@ } ] }, - "createdat": "2024-09-14T19:34:30.797725", + "createdat": "2024-09-18T14:19:01.555072", "epoch": 0, - "modifiedat": "2024-09-14T19:34:30.797725" + "modifiedat": "2024-09-18T14:19:01.555072" } } }, @@ -2834,9 +2834,9 @@ } ] }, - "createdat": "2024-09-14T19:34:33.988340", + "createdat": "2024-09-18T14:19:05.035722", "epoch": 0, - "modifiedat": "2024-09-14T19:34:33.988340" + "modifiedat": "2024-09-18T14:19:05.035722" } } }, @@ -2884,9 +2884,9 @@ } ] }, - "createdat": "2024-09-14T19:34:37.180477", + "createdat": "2024-09-18T14:19:08.491070", "epoch": 0, - "modifiedat": "2024-09-14T19:34:37.180477" + "modifiedat": "2024-09-18T14:19:08.491070" } } }, @@ -2924,9 +2924,9 @@ } ] }, - "createdat": "2024-09-14T19:34:40.400248", + "createdat": "2024-09-18T14:19:11.941969", "epoch": 0, - "modifiedat": "2024-09-14T19:34:40.400248" + "modifiedat": "2024-09-18T14:19:11.941969" } } }, @@ -2960,9 +2960,9 @@ } ] }, - "createdat": "2024-09-14T19:34:43.596618", + "createdat": "2024-09-18T14:19:15.260010", "epoch": 
0, - "modifiedat": "2024-09-14T19:34:43.596618" + "modifiedat": "2024-09-18T14:19:15.260010" } } }, @@ -3009,9 +3009,9 @@ } ] }, - "createdat": "2024-09-14T19:34:46.768025", + "createdat": "2024-09-18T14:19:18.502157", "epoch": 0, - "modifiedat": "2024-09-14T19:34:46.768025" + "modifiedat": "2024-09-18T14:19:18.502157" } } }, @@ -3045,9 +3045,9 @@ } ] }, - "createdat": "2024-09-14T19:34:49.978137", + "createdat": "2024-09-18T14:19:21.813559", "epoch": 0, - "modifiedat": "2024-09-14T19:34:49.978137" + "modifiedat": "2024-09-18T14:19:21.813559" } } }, @@ -3094,9 +3094,9 @@ } ] }, - "createdat": "2024-09-14T19:34:53.196657", + "createdat": "2024-09-18T14:19:25.164529", "epoch": 0, - "modifiedat": "2024-09-14T19:34:53.196657" + "modifiedat": "2024-09-18T14:19:25.164529" } } }, @@ -3203,9 +3203,9 @@ } ] }, - "createdat": "2024-09-14T19:34:56.449561", + "createdat": "2024-09-18T14:19:28.405668", "epoch": 0, - "modifiedat": "2024-09-14T19:34:56.449561" + "modifiedat": "2024-09-18T14:19:28.405668" } } }, @@ -3239,9 +3239,9 @@ } ] }, - "createdat": "2024-09-14T19:34:59.706992", + "createdat": "2024-09-18T14:19:31.633477", "epoch": 0, - "modifiedat": "2024-09-14T19:34:59.706992" + "modifiedat": "2024-09-18T14:19:31.633477" } } }, @@ -3404,9 +3404,9 @@ } ] }, - "createdat": "2024-09-14T19:35:02.942869", + "createdat": "2024-09-18T14:19:34.900235", "epoch": 0, - "modifiedat": "2024-09-14T19:35:02.942869" + "modifiedat": "2024-09-18T14:19:34.900235" } } }, @@ -3454,9 +3454,9 @@ } ] }, - "createdat": "2024-09-14T19:35:06.146186", + "createdat": "2024-09-18T14:19:38.176677", "epoch": 0, - "modifiedat": "2024-09-14T19:35:06.146186" + "modifiedat": "2024-09-18T14:19:38.176677" } } }, @@ -3490,9 +3490,9 @@ } ] }, - "createdat": "2024-09-14T19:35:09.421924", + "createdat": "2024-09-18T14:19:41.443410", "epoch": 0, - "modifiedat": "2024-09-14T19:35:09.421924" + "modifiedat": "2024-09-18T14:19:41.443410" } } }, @@ -3656,9 +3656,9 @@ } ] }, - "createdat": "2024-09-14T19:35:12.622154", + "createdat": "2024-09-18T14:19:44.774269", "epoch": 0, - "modifiedat": "2024-09-14T19:35:12.622154" + "modifiedat": "2024-09-18T14:19:44.774269" } } }, @@ -3818,9 +3818,9 @@ } ] }, - "createdat": "2024-09-14T19:35:15.838889", + "createdat": "2024-09-18T14:19:48.066988", "epoch": 0, - "modifiedat": "2024-09-14T19:35:15.838889" + "modifiedat": "2024-09-18T14:19:48.066988" } } }, @@ -3963,9 +3963,9 @@ } ] }, - "createdat": "2024-09-14T19:35:19.055283", + "createdat": "2024-09-18T14:19:51.301593", "epoch": 0, - "modifiedat": "2024-09-14T19:35:19.055283" + "modifiedat": "2024-09-18T14:19:51.301593" } } }, @@ -4008,9 +4008,9 @@ } ] }, - "createdat": "2024-09-14T19:35:22.302685", + "createdat": "2024-09-18T14:19:54.552806", "epoch": 0, - "modifiedat": "2024-09-14T19:35:22.302685" + "modifiedat": "2024-09-18T14:19:54.552806" } } }, @@ -4049,9 +4049,9 @@ } ] }, - "createdat": "2024-09-14T19:35:25.528325", + "createdat": "2024-09-18T14:19:57.817607", "epoch": 0, - "modifiedat": "2024-09-14T19:35:25.528325" + "modifiedat": "2024-09-18T14:19:57.817607" } } }, @@ -4265,16 +4265,16 @@ } ] }, - "createdat": "2024-09-14T19:35:28.764187", + "createdat": "2024-09-18T14:20:01.085696", "epoch": 0, - "modifiedat": "2024-09-14T19:35:28.764187" + "modifiedat": "2024-09-18T14:20:01.085696" } } } }, - "createdat": "2024-09-14T19:34:02.155037", + "createdat": "2024-09-18T14:18:31.008958", "epoch": 0, - "modifiedat": "2024-09-14T19:34:02.155037" + "modifiedat": "2024-09-18T14:18:31.008958" } } } \ No newline at end of file diff --git a/noaa/CONTAINER.md 
b/noaa/CONTAINER.md index 56ad4d4..8e8129c 100644
--- a/noaa/CONTAINER.md
+++ b/noaa/CONTAINER.md
@@ -10,6 +10,12 @@ The National Oceanic and Atmospheric Administration (NOAA) Tides and Currents AP
The bridge retrieves data from the NOAA Tides and Currents API and writes it to a Kafka topic as [CloudEvents](https://cloudevents.io/) in a JSON format, which is documented in [EVENTS.md](EVENTS.md).
+## Database Schemas and handling
+
+If you want to build a full data pipeline with all events ingested into a
+database, the integration with Fabric Eventhouse and Azure Data Explorer is
+described in [DATABASE.md](../DATABASE.md).
+
## Installing the Container Image
Pull the container image from the GitHub Container Registry:
diff --git a/noaa/noaa/noaa.kql b/noaa/kql/noaa.kql
similarity index 100%
rename from noaa/noaa/noaa.kql
rename to noaa/kql/noaa.kql
diff --git a/noaa/kql/noaa.kql.md b/noaa/kql/noaa.kql.md
new file mode 100644
index 0000000..7c4e2e7
--- /dev/null
+++ b/noaa/kql/noaa.kql.md
@@ -0,0 +1,2 @@
+# noaa/kql/noaa.kql
+
diff --git a/gtfs/xreg/run-kql-script.ps1 b/noaa/kql/run-kql-script.ps1
similarity index 100%
rename from gtfs/xreg/run-kql-script.ps1
rename to noaa/kql/run-kql-script.ps1
diff --git a/noaa/noaa/run-kql-script.ps1 b/noaa/noaa/run-kql-script.ps1
deleted file mode 100644
index 79b4ab7..0000000
--- a/noaa/noaa/run-kql-script.ps1
+++ /dev/null
@@ -1,38 +0,0 @@
-param(
- [Parameter(Mandatory=$true)]
- [string]$clusterUri,
-
- [Parameter(Mandatory=$true)]
- [string]$database,
-
- [Parameter(Mandatory=$true)]
- [string]$script
-)
-
-# Check if kusto.cli is in the path
-if (-not (Get-Command -Name "kusto.cli" -ErrorAction SilentlyContinue)) {
- # Create a temporary directory
- $tempDir = New-Item -ItemType Directory -Path $env:TEMP -Name "kusto_cli_temp" -ErrorAction Stop
-
- try {
- # Download the kusto.cli zip file
- $zipFile = Join-Path -Path $tempDir.FullName -ChildPath "kusto_cli.zip"
- Invoke-WebRequest -Uri "https://www.nuget.org/api/v2/package/Microsoft.Azure.Kusto.Tools" -OutFile $zipFile -ErrorAction Stop
-
- # Extract the zip file
- Expand-Archive -Path $zipFile -DestinationPath $tempDir.FullName -Force -ErrorAction Stop
-
- # Run kusto.cli from the tools directory
- $toolsDir = Join-Path -Path $tempDir.FullName -ChildPath "tools"
- $kustoCliPath = Join-Path -Path $toolsDir -ChildPath "kusto.cli"
- & $kustoCliPath "${clusterUri}/${database};fed=true" -script:${script} -linemode:false -keeprunning:false
- }
- finally {
- # Clean up the temporary directory
- Remove-Item -Path $tempDir.FullName -Recurse -Force -ErrorAction SilentlyContinue
- }
-}
-else {
- # kusto.cli is already in the path, so run it directly
- & kusto.cli "${clusterUri}/${database};fed=true" -script:${script} -linemode:false -keeprunning:false
-}
diff --git a/noaa/noaa/noaa.avsc b/noaa/xreg/noaa.avsc
similarity index 100%
rename from noaa/noaa/noaa.avsc
rename to noaa/xreg/noaa.avsc
diff --git a/noaa/noaa/noaa.xreg.json b/noaa/xreg/noaa.xreg.json
similarity index 100%
rename from noaa/noaa/noaa.xreg.json
rename to noaa/xreg/noaa.xreg.json
diff --git a/pegelonline/CONTAINER.md b/pegelonline/CONTAINER.md
index 32cda96..1215d2c 100644
--- a/pegelonline/CONTAINER.md
+++ b/pegelonline/CONTAINER.md
@@ -21,6 +21,12 @@ in a JSON format documented in [EVENTS.md](EVENTS.md). You can configure the
bridge to handle multiple stations by supplying their identifiers in the configuration.
+## Database Schemas and handling
+
+If you want to build a full data pipeline with all events ingested into a
+database, the integration with Fabric Eventhouse and Azure Data Explorer is
+described in [DATABASE.md](../DATABASE.md).
+
## Installing the Container Image
Pull the container image from the GitHub Container Registry:
diff --git a/pegelonline/kql/create-kql-script.ps1 b/pegelonline/kql/create-kql-script.ps1
new file mode 100644
index 0000000..d7c963a
--- /dev/null
+++ b/pegelonline/kql/create-kql-script.ps1
@@ -0,0 +1,13 @@
+$scriptPath = Split-Path -Parent $PSCommandPath
+$jsonFiles = Get-ChildItem -Path "$scriptPath/../xreg" -Filter "*.avsc" | Select-Object -ExpandProperty FullName
+$outputFile = ".schemas.avsc"
+
+$mergedArray = @()
+foreach ($file in $jsonFiles) {
+ $jsonContent = Get-Content $file -Raw | ConvertFrom-Json
+ $mergedArray += $jsonContent
+}
+$mergedArray | ConvertTo-Json -Depth 20 | Out-File $outputFile -Encoding UTF8
+
+avrotize a2k $outputFile --emit-cloudevents-dispatch --emit-cloudevents-columns > pegelonline.kql
+Remove-Item $outputFile
\ No newline at end of file
diff --git a/pegelonline/xreg/pegelonline.kql b/pegelonline/kql/pegelonline.kql
similarity index 98%
rename from pegelonline/xreg/pegelonline.kql
rename to pegelonline/kql/pegelonline.kql
index 7c0567d..12f7cc7 100644
--- a/pegelonline/xreg/pegelonline.kql
+++ b/pegelonline/kql/pegelonline.kql
@@ -87,6 +87,8 @@
```
+.drop materialized-view CurrentMeasurementLatest ifexists;
+
.create materialized-view with (backfill=true) CurrentMeasurementLatest on table CurrentMeasurement {
CurrentMeasurement | summarize arg_max(___time, *) by ___type, ___source, ___subject
}
@@ -179,6 +181,8 @@
```
+.drop materialized-view StationLatest ifexists;
+
.create materialized-view with (backfill=true) StationLatest on table Station {
Station | summarize arg_max(___time, *) by ___type, ___source, ___subject
}
diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_data/pyproject.toml b/pegelonline/pegelonline_producer/pegelonline_producer_data/pyproject.toml
index 8b7a3cc..6aedfe0 100644
--- a/pegelonline/pegelonline_producer/pegelonline_producer_data/pyproject.toml
+++ b/pegelonline/pegelonline_producer/pegelonline_producer_data/pyproject.toml
@@ -2,7 +2,7 @@
name = "pegelonline-producer-data"
version = "0.1.0"
description = "A package for handling Avro schema data"
-authors = ["Clemens Vasters"]
+authors = ["Your Name"]
license = "MIT"
packages = [ { include = "pegelonline_producer_data", from="src" } ]
diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/currentmeasurement.py b/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/currentmeasurement.py
index 64d96f6..321926e 100644
--- a/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/currentmeasurement.py
+++ b/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/currentmeasurement.py
@@ -26,7 +26,7 @@ class CurrentMeasurement:
timestamp: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="timestamp"))
value: float=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="value"))
stateMnwMhw: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stateMnwMhw"))
- stateNswHsw: str=dataclasses.field(kw_only=True,
metadata=dataclasses_json.config(field_name="stateNswHsw")) + stateNswHsw: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="stateNswHsw")) def __post_init__(self): @@ -89,7 +89,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/station.py b/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/station.py index 69a325f..54baacf 100644 --- a/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/station.py +++ b/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/station.py @@ -35,7 +35,7 @@ class Station: agency: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="agency")) longitude: float=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="longitude")) latitude: float=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="latitude")) - water: Water=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="water")) + water: Water=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="water")) def __post_init__(self): @@ -103,7 +103,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/water.py b/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/water.py index 8739ca8..608708f 100644 --- a/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/water.py +++ b/pegelonline/pegelonline_producer/pegelonline_producer_data/src/pegelonline_producer_data/de/wsv/pegelonline/water.py @@ -20,7 +20,7 @@ class Water: longname (str): Full name of the water body (maximum 255 characters).""" shortname: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="shortname")) - longname: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="longname")) + longname: str=dataclasses.field(kw_only=True, metadata=dataclasses_json.config(field_name="longname")) def __post_init__(self): @@ -80,7 +80,9 @@ def to_byte_array(self, content_type_string: str) -> bytes: content_type = content_type_string.split(';')[0].strip() result = None if content_type == 'application/json': + #pylint: disable=no-member result = self.to_json() + #pylint: enable=no-member if result is not None and content_type.endswith('+gzip'): with io.BytesIO() as stream: diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_currentmeasurement.py 
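> The `#pylint: disable=no-member` guards added around these `to_json()` calls work around a static-analysis gap: `dataclasses_json` attaches `to_json()`/`from_json()` at class-decoration time, so pylint cannot see the member on the class. A small illustration of the pattern (the decorator style here is for brevity; the generated classes wire up `dataclasses_json` via field metadata):

```python
import dataclasses
import dataclasses_json

@dataclasses_json.dataclass_json
@dataclasses.dataclass
class Water:
    """Stand-in mirroring the generated Water data class."""
    shortname: str
    longname: str

w = Water(shortname="RHEIN", longname="Rhein")
# to_json() exists at runtime but is invisible to pylint's member analysis:
print(w.to_json())  # pylint: disable=no-member
```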
b/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_currentmeasurement.py index 2f55dfa..b89ade0 100644 --- a/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_currentmeasurement.py +++ b/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_currentmeasurement.py @@ -10,6 +10,7 @@ from pegelonline_producer_data.de.wsv.pegelonline.currentmeasurement import CurrentMeasurement + class Test_CurrentMeasurement(unittest.TestCase): """ Test case for CurrentMeasurement @@ -27,11 +28,11 @@ def create_instance(): Create instance of CurrentMeasurement for testing """ instance = CurrentMeasurement( - station_uuid='gsffydwifedxfpaopebq', - timestamp='xmhsqrgkqrqwpvixvgbb', - value=float(6.312102505425477), - stateMnwMhw='scspspytmyexajpyvddy', - stateNswHsw='ijbusiwibplvnhrqjbfx' + station_uuid='bwalxghvlecmmqijrlja', + timestamp='egjchkvyalqznscstcyw', + value=float(99.04357651670534), + stateMnwMhw='mucujvicuwiwizvybvoa', + stateNswHsw='uaykmvloeexwaucoijoz' ) return instance @@ -40,7 +41,7 @@ def test_station_uuid_property(self): """ Test station_uuid property """ - test_value = 'gsffydwifedxfpaopebq' + test_value = 'bwalxghvlecmmqijrlja' self.instance.station_uuid = test_value self.assertEqual(self.instance.station_uuid, test_value) @@ -48,7 +49,7 @@ def test_timestamp_property(self): """ Test timestamp property """ - test_value = 'xmhsqrgkqrqwpvixvgbb' + test_value = 'egjchkvyalqznscstcyw' self.instance.timestamp = test_value self.assertEqual(self.instance.timestamp, test_value) @@ -56,7 +57,7 @@ def test_value_property(self): """ Test value property """ - test_value = float(6.312102505425477) + test_value = float(99.04357651670534) self.instance.value = test_value self.assertEqual(self.instance.value, test_value) @@ -64,7 +65,7 @@ def test_stateMnwMhw_property(self): """ Test stateMnwMhw property """ - test_value = 'scspspytmyexajpyvddy' + test_value = 'mucujvicuwiwizvybvoa' self.instance.stateMnwMhw = test_value self.assertEqual(self.instance.stateMnwMhw, test_value) @@ -72,7 +73,7 @@ def test_stateNswHsw_property(self): """ Test stateNswHsw property """ - test_value = 'ijbusiwibplvnhrqjbfx' + test_value = 'uaykmvloeexwaucoijoz' self.instance.stateNswHsw = test_value self.assertEqual(self.instance.stateNswHsw, test_value) diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_station.py b/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_station.py index 5f8b886..36c3bfd 100644 --- a/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_station.py +++ b/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_station.py @@ -11,6 +11,7 @@ from pegelonline_producer_data.de.wsv.pegelonline.station import Station from test_pegelonline_producer_data_de_wsv_pegelonline_water import Test_Water + class Test_Station(unittest.TestCase): """ Test case for Station @@ -28,14 +29,14 @@ def create_instance(): Create instance of Station for testing """ instance = Station( - uuid='dndwfdvhvhoaojulmqgy', - number='opexliaohznjiyllpinj', - shortname='oivbxhufrhkpzaagltqy', - longname='xgfgpogecrlqeatwdndl', - km=float(32.02403314849423), - agency='ymzrvqlmhluvwxwqcymg', - 
longitude=float(40.99463405832166), - latitude=float(80.06704490885124), + uuid='ygciorrlaoltnvqlkqog', + number='aqiqdzfgqbgivqgnrfhs', + shortname='yoqkvmwiherlqorzygqq', + longname='vkdwfqyyytjjpnkrrfiv', + km=float(8.379605816258096), + agency='zcaurpgpxmmtzjjjxzpi', + longitude=float(75.01678132618342), + latitude=float(0.051411525612299336), water=Test_Water.create_instance() ) return instance @@ -45,7 +46,7 @@ def test_uuid_property(self): """ Test uuid property """ - test_value = 'dndwfdvhvhoaojulmqgy' + test_value = 'ygciorrlaoltnvqlkqog' self.instance.uuid = test_value self.assertEqual(self.instance.uuid, test_value) @@ -53,7 +54,7 @@ def test_number_property(self): """ Test number property """ - test_value = 'opexliaohznjiyllpinj' + test_value = 'aqiqdzfgqbgivqgnrfhs' self.instance.number = test_value self.assertEqual(self.instance.number, test_value) @@ -61,7 +62,7 @@ def test_shortname_property(self): """ Test shortname property """ - test_value = 'oivbxhufrhkpzaagltqy' + test_value = 'yoqkvmwiherlqorzygqq' self.instance.shortname = test_value self.assertEqual(self.instance.shortname, test_value) @@ -69,7 +70,7 @@ def test_longname_property(self): """ Test longname property """ - test_value = 'xgfgpogecrlqeatwdndl' + test_value = 'vkdwfqyyytjjpnkrrfiv' self.instance.longname = test_value self.assertEqual(self.instance.longname, test_value) @@ -77,7 +78,7 @@ def test_km_property(self): """ Test km property """ - test_value = float(32.02403314849423) + test_value = float(8.379605816258096) self.instance.km = test_value self.assertEqual(self.instance.km, test_value) @@ -85,7 +86,7 @@ def test_agency_property(self): """ Test agency property """ - test_value = 'ymzrvqlmhluvwxwqcymg' + test_value = 'zcaurpgpxmmtzjjjxzpi' self.instance.agency = test_value self.assertEqual(self.instance.agency, test_value) @@ -93,7 +94,7 @@ def test_longitude_property(self): """ Test longitude property """ - test_value = float(40.99463405832166) + test_value = float(75.01678132618342) self.instance.longitude = test_value self.assertEqual(self.instance.longitude, test_value) @@ -101,7 +102,7 @@ def test_latitude_property(self): """ Test latitude property """ - test_value = float(80.06704490885124) + test_value = float(0.051411525612299336) self.instance.latitude = test_value self.assertEqual(self.instance.latitude, test_value) diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_water.py b/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_water.py index 222f4f2..00e0e75 100644 --- a/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_water.py +++ b/pegelonline/pegelonline_producer/pegelonline_producer_data/tests/test_pegelonline_producer_data_de_wsv_pegelonline_water.py @@ -10,6 +10,7 @@ from pegelonline_producer_data.de.wsv.pegelonline.water import Water + class Test_Water(unittest.TestCase): """ Test case for Water @@ -27,8 +28,8 @@ def create_instance(): Create instance of Water for testing """ instance = Water( - shortname='nsuwwpjcxzslkxpnyffk', - longname='zpmjrthehltzjkjjgybf' + shortname='uqypljtnvpnnjkfqwhlw', + longname='epfbzebxvmqavbchvxpt' ) return instance @@ -37,7 +38,7 @@ def test_shortname_property(self): """ Test shortname property """ - test_value = 'nsuwwpjcxzslkxpnyffk' + test_value = 'uqypljtnvpnnjkfqwhlw' self.instance.shortname = test_value 
self.assertEqual(self.instance.shortname, test_value) @@ -45,7 +46,7 @@ def test_longname_property(self): """ Test longname property """ - test_value = 'zpmjrthehltzjkjjgybf' + test_value = 'epfbzebxvmqavbchvxpt' self.instance.longname = test_value self.assertEqual(self.instance.longname, test_value) diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_kafka_producer/pyproject.toml b/pegelonline/pegelonline_producer/pegelonline_producer_kafka_producer/pyproject.toml index 3cfc635..f8a9646 100644 --- a/pegelonline/pegelonline_producer/pegelonline_producer_kafka_producer/pyproject.toml +++ b/pegelonline/pegelonline_producer/pegelonline_producer_kafka_producer/pyproject.toml @@ -1,7 +1,7 @@ [tool.poetry] name = "pegelonline-producer-kafka-producer" description = "pegelonline_producer_kafka_producer Apache Kafka consumer library" -authors = ["Clemens Vasters "] +authors = ["Your Name "] readme = "README.md" version = "0.1.0" # Placeholder version, dynamic versioning can be handled with plugins packages = [{include = "pegelonline_producer_kafka_producer", from = "src"}] diff --git a/pegelonline/pegelonline_producer/pegelonline_producer_kafka_producer/src/pegelonline_producer_kafka_producer/producer.py b/pegelonline/pegelonline_producer/pegelonline_producer_kafka_producer/src/pegelonline_producer_kafka_producer/producer.py index b329c38..aeab02c 100644 --- a/pegelonline/pegelonline_producer/pegelonline_producer_kafka_producer/src/pegelonline_producer_kafka_producer/producer.py +++ b/pegelonline/pegelonline_producer/pegelonline_producer_kafka_producer/src/pegelonline_producer_kafka_producer/producer.py @@ -60,7 +60,7 @@ async def send_de_wsv_pegelonline_station(self,_feedurl : str, _station_id : str attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -70,55 +70,6 @@ async def send_de_wsv_pegelonline_station(self,_feedurl : str, _station_id : str if flush_producer: self.producer.flush() - @classmethod - def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: - """ - Parse the connection string and extract bootstrap server, topic name, username, and password. - - Args: - connection_string (str): The connection string. 
- - Returns: - Tuple[Dict[str, str], str]: Kafka config, topic name - """ - config_dict = { - 'security.protocol': 'SASL_SSL', - 'sasl.mechanisms': 'PLAIN', - 'sasl.username': '$ConnectionString', - 'sasl.password': connection_string.strip() - } - kafka_topic = None - try: - for part in connection_string.split(';'): - if 'Endpoint' in part: - config_dict['bootstrap.servers'] = part.split('=')[1].strip( - '"').replace('sb://', '').replace('/', '')+':9093' - elif 'EntityPath' in part: - kafka_topic = part.split('=')[1].strip('"') - except IndexError as e: - raise ValueError("Invalid connection string format") from e - return config_dict, kafka_topic - - @classmethod - def from_connection_string(cls, connection_string: str, topic: typing.Optional[str]=None, content_mode: typing.Literal['structured','binary']='structured') -> 'DeWsvPegelonlineEventProducer': - """ - Create a Kafka producer from a connection string and a topic name. - - Args: - connection_string (str): The connection string. - topic (Optional[str]): The Kafka topic. - content_mode (typing.Literal['structured','binary']): The content mode to use for sending events - - Returns: - Producer: The Kafka producer - """ - config, topic_name = cls.parse_connection_string(connection_string) - if topic: - topic_name = topic - if not topic_name: - raise ValueError("Topic name not found in connection string") - return cls(Producer(config), topic_name, content_mode) - async def send_de_wsv_pegelonline_current_measurement(self,_feedurl : str, _station_id : str, data: CurrentMeasurement, content_type: str = "application/json", flush_producer=True, key_mapper: typing.Callable[[CloudEvent, CurrentMeasurement], str]=None) -> None: """ @@ -142,7 +93,7 @@ async def send_de_wsv_pegelonline_current_measurement(self,_feedurl : str, _stat attributes["datacontenttype"] = content_type event = CloudEvent.create(attributes, data) if self.content_mode == "structured": - message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) + message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper)) message.headers[b"content-type"] = b"application/cloudevents+json" else: content_type = "application/json" @@ -152,6 +103,7 @@ async def send_de_wsv_pegelonline_current_measurement(self,_feedurl : str, _stat if flush_producer: self.producer.flush() + @classmethod def parse_connection_string(cls, connection_string: str) -> typing.Tuple[typing.Dict[str, str], str]: """ diff --git a/pegelonline/xreg/create-kql-script.ps1 b/pegelonline/xreg/create-kql-script.ps1 deleted file mode 100644 index 5134101..0000000 --- a/pegelonline/xreg/create-kql-script.ps1 +++ /dev/null @@ -1,11 +0,0 @@ -$jsonFiles = @("currentmeasurement.avsc", "station.avsc") -$outputFile = "schemas.avsc" - -$mergedArray = @() -foreach ($file in $jsonFiles) { - $jsonContent = Get-Content $file -Raw | ConvertFrom-Json - $mergedArray += $jsonContent -} -$mergedArray | ConvertTo-Json -Depth 10 | Out-File $outputFile -Encoding UTF8 - -avrotize a2k schemas.avsc --emit-cloudevents-dispatch --emit-cloudevents-columns > pegelonline.kql diff --git a/pegelonline/xreg/pegelonline.xreg.json b/pegelonline/xreg/pegelonline.xreg.json index 0b6cf48..c2a9f46 100644 --- a/pegelonline/xreg/pegelonline.xreg.json +++ b/pegelonline/xreg/pegelonline.xreg.json @@ -8,14 +8,13 @@ "messages": { "de.wsv.pegelonline.Station": { "id": "de.wsv.pegelonline.Station", - "description": "A 
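> As with the GTFS producer, the pegelonline producer now carries `parse_connection_string` and `from_connection_string` exactly once, with the previously shadowed `@classmethod` decorator restored. A hypothetical bootstrap call, with placeholder credentials and the import path implied by this package layout:

```python
from pegelonline_producer_kafka_producer.producer import DeWsvPegelonlineEventProducer

# Placeholder Event Hubs-style connection string; EntityPath supplies the topic.
producer = DeWsvPegelonlineEventProducer.from_connection_string(
    "Endpoint=sb://example.servicebus.windows.net/;"
    "SharedAccessKeyName=send;SharedAccessKey=REDACTED;EntityPath=pegelonline",
    content_mode="structured",
)
```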
PEGELONLINE station with location and water body information.", "format": "CloudEvents/1.0", "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/de.wsv.pegelonline/schemas/de.wsv.pegelonline.Station", - "createdat": "2024-09-10T15:02:55.573631", + "createdat": "2024-09-18T14:22:31.895724", "epoch": 0, - "modifiedat": "2024-09-10T15:02:55.573631", + "modifiedat": "2024-09-18T14:22:31.895724", "metadata": { "specversion": { "name": "specversion", @@ -48,14 +47,13 @@ }, "de.wsv.pegelonline.CurrentMeasurement": { "id": "de.wsv.pegelonline.CurrentMeasurement", - "description": "The current measurement for a PEGELONLINE station.", "format": "CloudEvents/1.0", "binding": "None", "schemaformat": "Avro", "schemaurl": "#/schemagroups/de.wsv.pegelonline/schemas/de.wsv.pegelonline.CurrentMeasurement", - "createdat": "2024-09-10T15:02:58.776148", + "createdat": "2024-09-18T14:22:35.100298", "epoch": 0, - "modifiedat": "2024-09-10T15:02:58.776148", + "modifiedat": "2024-09-18T14:22:35.100298", "metadata": { "specversion": { "name": "specversion", @@ -87,9 +85,9 @@ } } }, - "createdat": "2024-09-10T15:02:55.572645", + "createdat": "2024-09-18T14:22:31.895724", "epoch": 0, - "modifiedat": "2024-09-10T15:02:55.572645" + "modifiedat": "2024-09-18T14:22:31.895724" } }, "schemagroups": { @@ -171,9 +169,9 @@ } ] }, - "createdat": "2024-09-10T15:02:54.786444", + "createdat": "2024-09-18T14:22:31.131859", "epoch": 0, - "modifiedat": "2024-09-10T15:02:54.786444" + "modifiedat": "2024-09-18T14:22:31.131859" } } }, @@ -221,16 +219,16 @@ } ] }, - "createdat": "2024-09-10T15:02:57.965791", + "createdat": "2024-09-18T14:22:34.289227", "epoch": 0, - "modifiedat": "2024-09-10T15:02:57.965791" + "modifiedat": "2024-09-18T14:22:34.289227" } } } }, - "createdat": "2024-09-10T15:02:54.777478", + "createdat": "2024-09-18T14:22:31.116771", "epoch": 0, - "modifiedat": "2024-09-10T15:02:54.777478" + "modifiedat": "2024-09-18T14:22:31.116771" } } } \ No newline at end of file diff --git a/pegelonline/xreg/schemas.avsc b/pegelonline/xreg/schemas.avsc deleted file mode 100644 index a9b7ba0..0000000 --- a/pegelonline/xreg/schemas.avsc +++ /dev/null @@ -1,107 +0,0 @@ -[ - { - "type": "record", - "name": "CurrentMeasurement", - "namespace": "de.wsv.pegelonline", - "doc": "Schema representing the current measurement for a PEGELONLINE station.", - "fields": [ - { - "name": "station_uuid", - "type": "string", - "doc": "Unique immutable identifier of the station." - }, - { - "name": "timestamp", - "type": "string", - "doc": "Timestamp of the current measurement encoded in ISO_8601 format." - }, - { - "name": "value", - "type": "double", - "doc": "Current measured value as a decimal number in the unit defined by the station's timeseries." - }, - { - "name": "stateMnwMhw", - "type": { - "type": "string", - "doc": "State of the current water level compared to mean low water (MNW) and mean high water (MHW). Possible values: 'low', 'normal', 'high', 'unknown', 'commented', 'out-dated'." - } - }, - { - "name": "stateNswHsw", - "type": { - "type": "string", - "doc": "State of the current water level compared to the highest navigable water level (HSW). Possible values: 'normal', 'high', 'unknown', 'commented', 'out-dated'." - } - } - ] - }, - { - "type": "record", - "name": "Station", - "namespace": "de.wsv.pegelonline", - "doc": "Schema representing a PEGELONLINE station with location and water body information.", - "fields": [ - { - "name": "uuid", - "type": "string", - "doc": "Unique immutable identifier of the station." 
- },
- {
- "name": "number",
- "type": "string",
- "doc": "Station number representing the unique code of the station."
- },
- {
- "name": "shortname",
- "type": "string",
- "doc": "Short name of the station (maximum 40 characters)."
- },
- {
- "name": "longname",
- "type": "string",
- "doc": "Full name of the station (maximum 255 characters)."
- },
- {
- "name": "km",
- "type": "double",
- "doc": "River kilometer marking of the station location."
- },
- {
- "name": "agency",
- "type": "string",
- "doc": "Waterways and Shipping Office responsible for the station."
- },
- {
- "name": "longitude",
- "type": "double",
- "doc": "Longitude coordinate of the station in WGS84 decimal notation."
- },
- {
- "name": "latitude",
- "type": "double",
- "doc": "Latitude coordinate of the station in WGS84 decimal notation."
- },
- {
- "name": "water",
- "type": {
- "type": "record",
- "name": "Water",
- "doc": "Details of the water body associated with the station.",
- "fields": [
- {
- "name": "shortname",
- "type": "string",
- "doc": "Short name of the water body (maximum 40 characters)."
- },
- {
- "name": "longname",
- "type": "string",
- "doc": "Full name of the water body (maximum 255 characters)."
- }
- ]
- }
- }
- ]
- }
-]
diff --git a/rss/CONTAINER.md b/rss/CONTAINER.md
index 8bc5efe..2c03e0e 100644
--- a/rss/CONTAINER.md
+++ b/rss/CONTAINER.md
@@ -24,6 +24,12 @@ Kafka topic as [CloudEvents](https://cloudevents.io/) in a JSON format, which is
documented in [EVENTS.md](EVENTS.md). You can specify multiple feed URLs by providing them in the configuration.
+## Database Schemas and handling
+
+If you want to build a full data pipeline with all events ingested into a
+database, the integration with Fabric Eventhouse and Azure Data Explorer is
+described in [DATABASE.md](../DATABASE.md).
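> The `schemas.avsc` scratch file deleted above described the `Station` record with its nested `Water` record; the same shape lives on in the generated data classes touched earlier in this diff. A hypothetical construction with made-up values, assuming the generated classes expose `to_json()` as their `to_byte_array` implementations suggest:

```python
from pegelonline_producer_data.de.wsv.pegelonline.station import Station
from pegelonline_producer_data.de.wsv.pegelonline.water import Water

# All values are made up; field names mirror the Avro schema above.
station = Station(
    uuid="00000000-0000-0000-0000-000000000000",
    number="0000000",
    shortname="EXAMPLE",
    longname="EXAMPLE STATION",
    km=1.0,
    agency="EXAMPLE AGENCY",
    longitude=6.9,
    latitude=50.9,
    water=Water(shortname="RIVER", longname="Example River"),
)
print(station.to_json())  # pylint: disable=no-member
```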
+
 ## Installing the Container Image
 
 Pull the container image from the GitHub Container Registry:
diff --git a/rss/xreg/create-kql-script.ps1 b/rss/kql/create-kql-script.ps1
similarity index 81%
rename from rss/xreg/create-kql-script.ps1
rename to rss/kql/create-kql-script.ps1
index 444ad8d..ad5a141 100644
--- a/rss/xreg/create-kql-script.ps1
+++ b/rss/kql/create-kql-script.ps1
@@ -1,6 +1,7 @@
 $scriptDir = Split-Path -Parent $PSCommandPath
-$inputFile = Join-Path $scriptDir "feeds.xreg.json"
-$outputFile = Join-Path $scriptDir "schemas.avsc"
+$inputFile = Join-Path $scriptDir "../xreg/feeds.xreg.json"
+$outputFile = Join-Path $scriptDir "../xreg/schemas.avsc"
+$kqlFile = Join-Path $scriptDir "feeds.kql"
 
 # Load the JSON content
 $jsonContent = Get-Content $inputFile -Raw | ConvertFrom-Json
@@ -21,5 +22,5 @@ foreach ($schemagroup in $jsonContent.schemagroups.psobject.Properties.value) {
 # Output the merged array to the output file
 $mergedArray | ConvertTo-Json -Depth 30 | Out-File $outputFile -Encoding UTF8
 
-avrotize a2k $outputFile --emit-cloudevents-dispatch --emit-cloudevents-columns > feeds.kql
+avrotize a2k $outputFile --emit-cloudevents-dispatch --emit-cloudevents-columns > $kqlFile
 
diff --git a/rss/xreg/feeds.kql b/rss/kql/feeds.kql
similarity index 100%
rename from rss/xreg/feeds.kql
rename to rss/kql/feeds.kql
diff --git a/rss/rssbridge_producer/rssbridge_producer_kafka_producer/src/rssbridge_producer_kafka_producer/producer.py b/rss/rssbridge_producer/rssbridge_producer_kafka_producer/src/rssbridge_producer_kafka_producer/producer.py
index 6b9c62c..ad06d63 100644
--- a/rss/rssbridge_producer/rssbridge_producer_kafka_producer/src/rssbridge_producer_kafka_producer/producer.py
+++ b/rss/rssbridge_producer/rssbridge_producer_kafka_producer/src/rssbridge_producer_kafka_producer/producer.py
@@ -58,7 +58,7 @@ async def send_microsoft_open_data_rss_feeds_feed_item(self,_sourceurl : str, _i
         attributes["datacontenttype"] = content_type
         event = CloudEvent.create(attributes, data)
         if self.content_mode == "structured":
-            message = to_structured(event, data_marshaller=lambda x: x.to_json(), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
+            message = to_structured(event, data_marshaller=lambda x: json.loads(x.to_json()), key_mapper=lambda x: self.__key_mapper(x, data, key_mapper))
             message.headers[b"content-type"] = b"application/cloudevents+json"
         else:
             content_type = "application/json"
diff --git a/rss/xreg/run-kql-script.ps1 b/rss/xreg/run-kql-script.ps1
deleted file mode 100644
index 79b4ab7..0000000
--- a/rss/xreg/run-kql-script.ps1
+++ /dev/null
@@ -1,38 +0,0 @@
-param(
-    [Parameter(Mandatory=$true)]
-    [string]$clusterUri,
-
-    [Parameter(Mandatory=$true)]
-    [string]$database,
-
-    [Parameter(Mandatory=$true)]
-    [string]$script
-)
-
-# Check if kusto.cli is in the path
-if (-not (Get-Command -Name "kusto.cli" -ErrorAction SilentlyContinue)) {
-    # Create a temporary directory
-    $tempDir = New-Item -ItemType Directory -Path $env:TEMP -Name "kusto_cli_temp" -ErrorAction Stop
-
-    try {
-        # Download the kusto.cli zip file
-        $zipFile = Join-Path -Path $tempDir.FullName -ChildPath "kusto_cli.zip"
-        Invoke-WebRequest -Uri "https://www.nuget.org/api/v2/package/Microsoft.Azure.Kusto.Tools" -OutFile $zipFile -ErrorAction Stop
-
-        # Extract the zip file
-        Expand-Archive -Path $zipFile -DestinationPath $tempDir.FullName -Force -ErrorAction Stop
-
-        # Run kusto.cli from the tools directory
-        $toolsDir = Join-Path -Path $tempDir.FullName -ChildPath "tools"
-        $kustoCliPath = Join-Path -Path $toolsDir -ChildPath "kusto.cli"
-        & $kustoCliPath "${clusterUri}/${database};fed=true" -script:${script} -linemode:false -keeprunning:false
-    }
-    finally {
-        # Clean up the temporary directory
-        Remove-Item -Path $tempDir.FullName -Recurse -Force -ErrorAction SilentlyContinue
-    }
-}
-else {
-    # kusto.cli is already in the path, so run it directly
-    & kusto.cli "${clusterUri}/${database};fed=true" -script:${script} -linemode:false -keeprunning:false
-}
diff --git a/tools/clean-user-path.ps1 b/tools/clean-user-path.ps1
new file mode 100644
index 0000000..6988358
--- /dev/null
+++ b/tools/clean-user-path.ps1
@@ -0,0 +1,32 @@
+<#
+.SYNOPSIS
+    Removes user-scoped PATH entries that already exist in the machine-scoped PATH.
+
+.DESCRIPTION
+    This script compares the user-scoped PATH and machine-scoped PATH environment variables.
+    Any entries in the user-scoped PATH that are also present in the machine-scoped PATH will be removed from the user scope.
+
+.NOTES
+    This script requires sufficient privileges to modify user environment variables.
+    It does not require Administrator privileges, as it only modifies the user scope.
+
+.EXAMPLE
+    ./clean-user-path.ps1
+#>
+
+# Get the machine-scoped and user-scoped PATH environment variables
+$machinePath = [System.Environment]::GetEnvironmentVariable("Path", [System.EnvironmentVariableTarget]::Machine) -split ";"
+$userPath = [System.Environment]::GetEnvironmentVariable("Path", [System.EnvironmentVariableTarget]::User) -split ";"
+
+# Keep only the non-empty user PATH entries that are not also present in the machine PATH
+$cleanedUserPath = $userPath | Where-Object { $_ -and -not ($machinePath -contains $_) }
+
+# Join the cleaned user path back into a single string
+$updatedUserPath = ($cleanedUserPath -join ";").TrimEnd(";")
+
+# Set the new user PATH
+[System.Environment]::SetEnvironmentVariable("Path", $updatedUserPath, [System.EnvironmentVariableTarget]::User)
+
+# Output the result
+Write-Host "User PATH cleaned. Redundant entries removed."
+Write-Host "Updated user PATH: $updatedUserPath"
diff --git a/tools/install-avrotize.ps1 b/tools/install-avrotize.ps1
new file mode 100644
index 0000000..5a15b26
--- /dev/null
+++ b/tools/install-avrotize.ps1
@@ -0,0 +1,74 @@
+# Get the user's profile directory
+$userProfile = [System.Environment]::GetFolderPath('UserProfile')
+
+# Define the path for the virtual environment
+$venvPath = Join-Path $userProfile "avrotize"
+
+# Define the path for the batch wrapper script
+$avrotizeBatchPath = Join-Path $userProfile "avrotize.bat"
+
+# Check if Python is installed
+$pythonPath = Get-Command python -ErrorAction SilentlyContinue
+if (-not $pythonPath) {
+    Write-Host "Python is not installed. Installing Python using winget..."
+
+    # Install Python using winget
+    winget install Python.Python.3
+    if ($?) {
+        Write-Host "Python installed successfully."
+    } else {
+        Write-Host "Python installation failed."
+        exit 1
+    }
+}
+
+# Check if pip is installed
+$pipPath = Get-Command pip -ErrorAction SilentlyContinue
+if (-not $pipPath) {
+    Write-Host "pip is not installed. Installing pip..."
+    python -m ensurepip --upgrade # Ensures pip is installed
+    if ($?) {
+        Write-Host "pip has been installed successfully."
+    } else {
+        Write-Host "Failed to install pip. Please install it manually."
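+        # pip is required to install avrotize into the virtual environment
+        # created below, so stop here rather than continue without it.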
+        exit 1
+    }
+}
+
+# Create a virtual environment in the "avrotize" subdir of the user profile
+if (-not (Test-Path $venvPath)) {
+    Write-Host "Creating a virtual environment in $venvPath"
+    python -m venv $venvPath
+}
+
+# Activate the virtual environment and install avrotize
+$activateScript = "$venvPath\Scripts\Activate.ps1"
+if (Test-Path $activateScript) {
+    Write-Host "Activating virtual environment and installing avrotize..."
+    & $activateScript
+    pip install avrotize
+    Write-Host "avrotize has been installed successfully in the virtual environment."
+} else {
+    Write-Host "Failed to activate virtual environment."
+    exit 1
+}
+
+# Create a batch file wrapper for avrotize
+$batchCommand = @"
+@echo off
+call "$venvPath\Scripts\activate.bat"
+python -m avrotize %*
+"@
+
+# Write the avrotize wrapper batch file to the user profile directory
+Write-Host "Creating a batch file wrapper at $avrotizeBatchPath..."
+$batchCommand | Out-File -FilePath $avrotizeBatchPath -Encoding UTF8
+
+# Make the wrapper globally executable by adding its directory to the PATH environment variable
+$pathEnv = [System.Environment]::GetEnvironmentVariable("Path", [System.EnvironmentVariableTarget]::User)
+if (-not $pathEnv.Contains($userProfile)) {
+    Write-Host "Adding $userProfile to PATH..."
+    [System.Environment]::SetEnvironmentVariable("Path", "$pathEnv;$userProfile", [System.EnvironmentVariableTarget]::User)
+}
+
+Write-Host "Setup complete. You can now run 'avrotize' from any PowerShell or command prompt session."
diff --git a/tools/install-kusto-cli.ps1 b/tools/install-kusto-cli.ps1
new file mode 100644
index 0000000..85da713
--- /dev/null
+++ b/tools/install-kusto-cli.ps1
@@ -0,0 +1,86 @@
+<#
+.SYNOPSIS
+    Automates the installation of the Kusto CLI by installing the Microsoft.Azure.Kusto.Tools package and extracting the required files.
+
+.DESCRIPTION
+    This script installs the Kusto CLI by downloading and extracting the Microsoft.Azure.Kusto.Tools package.
+    For Windows PowerShell, it uses the net472 version of the CLI tool.
+    For PowerShell 7+, it uses the net6.0 version.
+
+.NOTES
+    Elevated privileges (run as Administrator) are not required unless permissions are needed to modify the PATH.
+
+.EXAMPLE
+    ./install-kusto-cli.ps1
+#>
+
+# Step 0: Check if Kusto CLI is already installed and available
+if (Get-Command kusto.cli -ErrorAction SilentlyContinue) {
+    Write-Host "Kusto CLI is already installed. Exiting..."
+    exit 0
+}
+
+# Ensure the NuGet provider is available for Install-Package
+Write-Host "Checking for the NuGet package provider..."
+$nugetProvider = Get-PackageProvider -Name "NuGet" -ErrorAction SilentlyContinue
+
+if (-not $nugetProvider) {
+    Write-Host "NuGet provider not found. Installing NuGet provider..."
+    Install-PackageProvider -Name "NuGet" -Force -Scope CurrentUser
+}
+
+# Step 1: Install Kusto CLI via Install-Package
+Write-Host "Installing Kusto CLI..."
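+# Note: Install-Package extracts the package into a versioned
+# "Microsoft.Azure.Kusto.Tools.<version>" subfolder of the destination,
+# which is why Step 2 below locates the folder with a wildcard.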
+
+# Define target folder for Kusto CLI
+$targetFolder = "$env:USERPROFILE\KustoCLI" # You can change this to any desired path
+
+# Install the Microsoft.Azure.Kusto.Tools package via Install-Package
+Install-Package -Name "Microsoft.Azure.Kusto.Tools" -Source "nuget.org" -ProviderName "NuGet" -Destination $targetFolder -Force
+
+# Step 2: Determine the correct tools directory based on PowerShell version
+$psVersion = $PSVersionTable.PSVersion.Major
+
+# Find the versioned package folder by taking the first wildcard match
+$toolsFolder = Get-ChildItem -Path (Join-Path -Path $targetFolder -ChildPath "Microsoft.Azure.Kusto.Tools*") -Directory | Select-Object -First 1
+if (-not $toolsFolder) {
+    Write-Host "Failed to locate the package folder. Exiting..."
+    exit 1
+}
+$toolsFolder = $toolsFolder.FullName
+
+if ($psVersion -ge 7) {
+    # PowerShell 7+ uses the net6.0 directory
+    $toolsFolder = Join-Path -Path $toolsFolder -ChildPath "tools\net6.0"
+    Write-Host "PowerShell 7+ detected. Using net6.0 version of the Kusto CLI."
+} else {
+    # Windows PowerShell uses the net472 directory
+    $toolsFolder = Join-Path -Path $toolsFolder -ChildPath "tools\net472"
+    Write-Host "Windows PowerShell detected. Using net472 version of the Kusto CLI."
+}
+
+# Check if the framework-specific tools folder exists
+if (-not (Test-Path $toolsFolder)) {
+    Write-Host "Failed to locate the framework-specific tools folder ($toolsFolder). Exiting..."
+    exit 1
+}
+
+# Step 3: Add the tools folder to PATH if not already present
+$userPath = [Environment]::GetEnvironmentVariable("Path", [EnvironmentVariableTarget]::User)
+if (-not $userPath.Contains($toolsFolder)) {
+    Write-Host "Adding Kusto CLI tools folder to PATH..."
+    [Environment]::SetEnvironmentVariable("Path", $userPath + ";$toolsFolder", [EnvironmentVariableTarget]::User)
+    Write-Host "Please restart your terminal for changes to take effect."
+} else {
+    Write-Host "Kusto CLI tools folder is already in your PATH."
+}
+
+# Step 4: Update this session's PATH if the folder is not yet in it
+if (-not $env:PATH.Contains($toolsFolder)) {
+    Write-Host "Adding Kusto CLI tools folder to PATH for this session..."
+    $env:PATH += ";$toolsFolder"
+}
+
+Write-Host "Installation complete. The Kusto CLI tool is now ready for use."
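+
+# A quick way to verify the installation in a new session (a sketch; replace
+# the placeholders with your own cluster and database):
+#   kusto.cli "https://<cluster>.kusto.windows.net/<database>;fed=true" -execute:".show tables"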
diff --git a/tools/media/eventhouse-details.png b/tools/media/eventhouse-details.png
new file mode 100644
index 0000000..5141dc7
Binary files /dev/null and b/tools/media/eventhouse-details.png differ
diff --git a/tools/media/get-data.png b/tools/media/get-data.png
new file mode 100644
index 0000000..291d3c7
Binary files /dev/null and b/tools/media/get-data.png differ
diff --git a/tools/media/inspect-data.png b/tools/media/inspect-data.png
new file mode 100644
index 0000000..cad4ed7
Binary files /dev/null and b/tools/media/inspect-data.png differ
diff --git a/tools/media/pick-destination.png b/tools/media/pick-destination.png
new file mode 100644
index 0000000..fa42c93
Binary files /dev/null and b/tools/media/pick-destination.png differ
diff --git a/tools/run-kql-script.ps1 b/tools/run-kql-script.ps1
new file mode 100644
index 0000000..231e42c
--- /dev/null
+++ b/tools/run-kql-script.ps1
@@ -0,0 +1,19 @@
+param(
+    [Parameter(Mandatory=$true)]
+    [string]$clusterUri,
+
+    [Parameter(Mandatory=$true)]
+    [string]$database,
+
+    [Parameter(Mandatory=$true)]
+    [string]$script
+)
+
+# Check if kusto.cli is in the path
+if (-not (Get-Command -Name "kusto.cli" -ErrorAction SilentlyContinue)) {
+    # Call the install script
+    $installScriptPath = Join-Path -Path (Split-Path -Path $MyInvocation.MyCommand.Path -Parent) -ChildPath "install-kusto-cli.ps1"
+    & $installScriptPath
+}
+
+kusto.cli "${clusterUri}/${database};fed=true" -script:${script} -linemode:false -keeprunning:false
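+
+# Example invocation (a sketch, run from the repository root; substitute your
+# own cluster URI and database name for the placeholders):
+#   .\tools\run-kql-script.ps1 -clusterUri 'https://<cluster>.kusto.windows.net' -database '<databaseName>' -script .\rss\kql\feeds.kql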