I'm working with the RealEstateCore models, and I'm trying to upload them in as few API calls as possible. I keep track of each model's dependencies and only upload models whose dependencies have all been uploaded already. At some point, I try to upload a batch of 79 models and am confronted with the following error:
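For context, this is roughly the batching strategy I'm using, as a minimal sketch (the function and data shapes are illustrative, not my actual code): models are grouped into layers, where each layer only references models that were uploaded in earlier layers.

```python
def batch_by_dependencies(models):
    """Group models into upload batches.

    models: dict mapping a DTMI to the set of DTMIs it depends on
    (via `extends`, component schemas, relationship targets, etc.).
    Returns a list of batches; every model in a batch depends only
    on models in earlier batches.
    """
    remaining = dict(models)
    uploaded = set()
    batches = []
    while remaining:
        # A model is ready when all of its dependencies are uploaded.
        ready = sorted(m for m, deps in remaining.items() if deps <= uploaded)
        if not ready:
            raise ValueError("circular or unresolvable dependency")
        batches.append(ready)
        uploaded.update(ready)
        for m in ready:
            del remaining[m]
    return batches
```

By this construction, no batch should ever reference a DTMI that the service doesn't already know about, which is why the DtmiResolver error surprises me.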
None of the models in this request could be created due to a problem with one or more models: DtmiResolver failed to resolve requisite references to element(s): dtmi:digitaltwins:rec_3_3:device:AccelerationSensor;1 dtmi:digitaltwins:rec_3_3:device:AirQualitySensor;1 dtmi:digitaltwins:rec_3_3:device:CurrentSensor;1 dtmi:digitaltwins:rec_3_3:device:DistanceSensor;1 dtmi:digitaltwins:rec_3_3:device:EnergySensor;1 dtmi:digitaltwins:rec_3_3:device:EnthalpySensor;1 <REDACTED 23 OTHER MODELS> See model documentation(http://aka.ms/ADTv2Models) for supported format. Status: 400 (Bad Request) ErrorCode: DTDLParserError
I assume, from the description, that it's missing some dependencies. But when I use the API to list the models, they all exist. Does this error mean something else instead?
The 79 models add up to a request body of 30,745 bytes, which should fit within the 32 KB limit stated in the service limits. I'm a bit confused, because the service limits also say the maximum size of a JSON body for a single model is 1 MB.
I'm also staying well within the limit of 100 requests per second, as this is only the 10th call I've made.
Uploading fewer models per call does work: when I cap the request body at 25,000 bytes, the upload succeeds.
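The 25,000-byte workaround can be sketched like this (again illustrative, with a hypothetical `split_by_body_size` helper): a dependency-safe batch is split further so each request's serialized JSON body stays under a byte budget.

```python
import json

def split_by_body_size(models, max_bytes=25000):
    """Split a list of model dicts into sub-batches whose serialized
    JSON array body does not exceed max_bytes. Order is preserved,
    so dependency ordering from earlier batching is kept intact."""
    batch, size = [], 2  # 2 bytes for the enclosing "[]"
    for model in models:
        # Size contribution of this model: its JSON plus a separator comma.
        encoded = len(json.dumps(model).encode("utf-8")) + 1
        if batch and size + encoded > max_bytes:
            yield batch
            batch, size = [], 2
        batch.append(model)
        size += encoded
    if batch:
        yield batch
```

With this in place, each sub-batch is sent as its own create-models call, which is why the total call count goes up compared to the single 30,745-byte request.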