Reputation: 12496
I have an Azure Stream Analytics job that receives some raw events, transforms them, and then writes them to several different outputs. I get the following error:
Maximum Event Hub receivers exceeded. Only 5 receivers per partition are allowed. Please use dedicated consumer group(s) for this input. If there are multiple queries using same input, share your input using WITH clause.
This is weird, because I use a common table expression (the WITH clause) at the beginning to read all the data once, and after that I don't access the event hub anymore. Here is the query:
WITH
ODSMeasurements AS (
    SELECT
        collectionTimestamp,
        analogValues,
        digitalValues,
        type,
        translationTable
    FROM EventhubODSMeasurements
),
-- Combine analog and digital measurements
CombineAnalogAndDigital AS (
    SELECT
        CAST(CONCAT(SUBSTRING(ODS.collectionTimestamp, 1, 10), ' ', SUBSTRING(ODS.collectionTimestamp, 12, 12)) AS datetime) AS "TimeStamp",
        ROUND(AV.PropertyValue.value / (CAST(TT.ConversionFactor AS float)), 5) AS "ValueNumber",
        NULL AS "ValueBit",
        CAST(TT.MeasurementTypeId AS bigint) AS "MeasurementTypeId",
        TT.MeasurementTypeName AS "MeasurementName",
        TT.PartName AS "PartName",
        CAST(TT.ElementId AS bigint) AS "ElementId",
        TT.ElementName AS "ElementName",
        TT.ObjectName AS "ObjectName",
        TT.LocationName AS "LocationName",
        CAST(TT.TranslationTableId AS bigint) AS "TranslationTableId",
        ODS.Type AS "Status"
    FROM ODSMeasurements ODS
    CROSS APPLY GetRecordProperties(analogValues) AS AV
    INNER JOIN SQLTranslationTable TT
        ON TT.Tag = AV.PropertyName AND
           CAST(TT.Version AS bigint) = ODS.translationTable.version AND
           TT.Name = ODS.translationTable.name
    UNION
    SELECT
        CAST(CONCAT(SUBSTRING(ODS.collectionTimestamp, 1, 10), ' ', SUBSTRING(ODS.collectionTimestamp, 12, 12)) AS datetime) AS "TimeStamp",
        CAST(-9999.00000 AS float) AS "ValueNumber",
        CAST(DV.PropertyValue.value AS nvarchar(max)) AS "ValueBit",
        CAST(TT.MeasurementTypeId AS bigint) AS "MeasurementTypeId",
        TT.MeasurementTypeName AS "MeasurementName",
        TT.PartName AS "PartName",
        CAST(TT.ElementId AS bigint) AS "ElementId",
        TT.ElementName AS "ElementName",
        TT.ObjectName AS "ObjectName",
        TT.LocationName AS "LocationName",
        CAST(TT.TranslationTableId AS bigint) AS "TranslationTableId",
        ODS.Type AS "Status"
    FROM ODSMeasurements ODS
    CROSS APPLY GetRecordProperties(digitalValues) AS DV
    INNER JOIN SQLTranslationTable TT
        ON TT.Tag = DV.PropertyName AND
           CAST(TT.Version AS bigint) = ODS.translationTable.version AND
           TT.Name = ODS.translationTable.name
)
-- Output data
SELECT *
INTO DatalakeHarmonizedMeasurements
FROM CombineAnalogAndDigital
PARTITION BY TranslationTableId

SELECT *
INTO FunctionsHarmonizedMeasurements
FROM CombineAnalogAndDigital

SELECT Timestamp, ValueNumber, CAST(ValueBit AS bit) AS ValueBit, ElementId, MeasurementTypeId, CAST(TranslationTableId AS bigint) AS TranslationTableId
INTO SQLRealtimeMeasurements
FROM CombineAnalogAndDigital

SELECT *
INTO EventHubHarmonizedMeasurements
FROM CombineAnalogAndDigital
PARTITION BY TranslationTableId
And this is the event hub input that I use:
{
    "Name": "EventhubODSMeasurements",
    "Type": "Data Stream",
    "DataSourceType": "Event Hub",
    "EventHubProperties": {
        "ServiceBusNamespace": "xxx",
        "EventHubName": "xxx",
        "SharedAccessPolicyName": "xxx",
        "SharedAccessPolicyKey": null,
        "ConsumerGroupName": "streamanalytics",
        "AuthenticationMode": "ConnectionString"
    },
    "DataSourceCredentialDomain": "xxx",
    "Serialization": {
        "Type": "Json",
        "Encoding": "UTF8"
    },
    "PartitionKey": null,
    "CompressionType": "None",
    "ScriptType": "Input"
}
I use a separate consumer group for this input as well. As far as I can see, I'm doing everything right. Does anyone know what's going on?
Edit: I enabled the diagnostic logs, and they say this:
Exceeded the maximum number of allowed receivers per partition in a consumer group which is 5. List of connected receivers - nil, nil, nil, nil, nil.
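For intuition, the service-side limit can be pictured as a counter per (consumer group, partition) pair. The sketch below is an illustrative model only, not an Azure SDK API; the class and consumer group names are hypothetical. It shows how one job stays under the limit while extra readers on the same consumer group eventually hit the sixth-receiver rejection, and how a dedicated consumer group avoids the collision:

```python
# Illustrative model only, not an Azure SDK API: Event Hubs allows at most
# 5 concurrent receivers per partition within one consumer group.
MAX_RECEIVERS_PER_PARTITION = 5

class ConsumerGroup:
    """Tracks open receivers per partition, mimicking the service-side limit."""
    def __init__(self, name):
        self.name = name
        self.receivers = {}  # partition id -> number of open receivers

    def open_receiver(self, partition):
        count = self.receivers.get(partition, 0)
        if count >= MAX_RECEIVERS_PER_PARTITION:
            raise RuntimeError(
                f"Maximum Event Hub receivers exceeded on partition {partition} "
                f"of consumer group '{self.name}'.")
        self.receivers[partition] = count + 1

# A single job that shares its input via a WITH clause opens one receiver
# per partition and stays well under the limit...
group = ConsumerGroup("streamanalytics")
group.open_receiver(partition=0)

# ...but every other reader on the SAME consumer group (a second job, a
# debugging client, a scaled-out instance) adds to the same counter:
for _ in range(4):
    group.open_receiver(partition=0)   # 5 receivers now open on partition 0

try:
    group.open_receiver(partition=0)   # the 6th receiver is rejected
except RuntimeError as err:
    print(err)

# A dedicated consumer group per reader keeps the counters separate,
# which is what the error message recommends.
dedicated = ConsumerGroup("streamanalytics-job2")  # hypothetical name
dedicated.open_receiver(partition=0)               # no conflict
```

The point of the model: the limit is enforced per consumer group, so a job that correctly shares its input internally can still be pushed over the limit by an unrelated reader using the same group.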
Upvotes: 0
Views: 511
Reputation: 12496
Turns out the issue was PEBKAC. Another job was accidentally pointed at the same input Event Hub.
Upvotes: 1