Hello and welcome back to Microsoft Q&A @Anonymous.
Data gets pushed to Event Hubs; Event Hubs does not pull the data for you.
This means you will need to write a producer, and run the producer on a schedule.
Alternatively you could write a long-running producer that waits between batches.
There are a number of languages available. For this example, I will build off the Python quickstart.
import asyncio

from azure.eventhub.aio import EventHubProducerClient
from azure.eventhub import EventData

async def run():
    # Create a producer client to send messages to the event hub.
    # Specify a connection string to your Event Hubs namespace and
    # the event hub name.
    producer = EventHubProducerClient.from_connection_string(
        conn_str="EVENT HUBS NAMESPACE - CONNECTION STRING",
        eventhub_name="EVENT HUB NAME")

    async with producer:
        while True:
            # Create a batch.
            event_data_batch = await producer.create_batch()

            # Get the data, for example with requests
            # (https://requests.readthedocs.io/en/master/),
            # pycurl (http://pycurl.io/), or urllib.
            data = request_data(url)

            # Add events to the batch.
            for record in data:
                event_data_batch.add(EventData(record))

            # Send the batch of events to the event hub.
            await producer.send_batch(event_data_batch)

            # Wait four hours between iterations. asyncio.sleep is used
            # instead of time.sleep so the event loop is not blocked.
            await asyncio.sleep(4 * 60 * 60)

asyncio.run(run())
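Note that request_data and url above are placeholders you need to supply yourself. A minimal sketch of such a helper using the requests library, assuming a hypothetical endpoint that returns a JSON array of records, could look like this:

import json
import requests

def request_data(url):
    # Fetch the latest records from the source API.
    # Assumes the endpoint returns a JSON array; adjust the parsing to
    # match the payload your provider actually returns.
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # EventData accepts str or bytes, so serialize each record to JSON.
    return [json.dumps(record) for record in response.json()]

pycurl or urllib would work just as well; requests simply keeps the example short.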
I would also like to note that the data you shared above appears to be dimension data defining the regions of the world, rather than event data describing the weather. This type of data changes very slowly and infrequently, so it would be better used as a reference table (reference input) in Stream Analytics, as in the sketch below.
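Stream Analytics can read reference data from Azure Blob Storage, among other sources. A minimal sketch, assuming a hypothetical storage connection string, container name, and blob path, for uploading the region data so you can configure it as a reference input:

from azure.storage.blob import BlobServiceClient

# The connection string, container, and blob path are placeholders; use the
# storage account you point the Stream Analytics reference input at.
blob_service = BlobServiceClient.from_connection_string("STORAGE ACCOUNT - CONNECTION STRING")
blob_client = blob_service.get_blob_client(container="reference-data", blob="regions/regions.json")

with open("regions.json", "rb") as f:
    blob_client.upload_blob(f, overwrite=True)

You would then JOIN the streaming weather events against this reference input in your Stream Analytics query instead of sending the region data through the event hub.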