Forum Sensor.Community

R-Script or tips to download and use sensor data

Hello everyone,

I am looking for tips on how to easily download sensor data for a specific country (Belgium) and for specific periods, but I am having a lot of difficulty finding it on https://archive.sensor.community/. I don't really understand how to find the data, as there is no filter for searching for specific data.

Moreover, I need the data for a paper for my university studies. So I would like to know if anyone has an R script or something similar for doing basic statistics on the .csv data files for my project, for example to run calculations in Excel or RStudio once the data is downloaded?

Thanks so much for your help,

Bayens M.

Hi,
shortly after your first post about this question, someone sent this link:


Thanks so much for contributing!
This paper looks very interesting for my project.

I would do:

  1. load the current API data to get the list of all sensors as JSON: https://data.sensor.community/static/v1/data.json

  2. import the data into QGIS: Download QGIS

  3. get a shapefile of Belgium and import it into QGIS

  4. use the intersection algorithm in QGIS to get the IDs of the sensors located in Belgium

  5. write a simple Python script to download files from the archive according to the IDs and dates:

    import requests

    # Put the sensor IDs in this list, separated by commas
    sensor_ids = []
    # Put the dates in this list as 'YYYY-MM-DD' strings, separated by commas
    dates = []
    base_url = 'https://archive.sensor.community/'

    for date in dates:
        day_url = base_url + date
        # Fetch the archive index page for that day
        r1 = requests.get(day_url)
        source_code = r1.text

        for sensor in sensor_ids:
            test = 'sensor_' + str(sensor) + '.csv'

            if test in source_code:
                # The index links each file as <a href="...">; recover the
                # part of the file name that precedes 'sensor_<id>.csv'
                split1 = source_code.split(test)[0]
                split2 = split1.split('<a href="')[-1]
                file_url = day_url + '/' + split2 + test
                r2 = requests.get(file_url)
                data = r2.text
                # The data will be printed to the terminal
                print(data)
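If you prefer to skip QGIS, steps 1 and 4 can also be sketched directly in Python. This is a rough sketch under two assumptions: that each entry in data.json carries a 'sensor' dict with an 'id' and a 'location' dict with 'latitude'/'longitude' stored as strings, and that a simple bounding box (the coordinates below are only approximate) is an acceptable stand-in for a proper shapefile intersection:

```python
import json

# Approximate bounding box for Belgium (rough values; a QGIS
# intersection with a real shapefile is more precise)
LAT_MIN, LAT_MAX = 49.5, 51.51
LON_MIN, LON_MAX = 2.54, 6.41

def sensor_ids_in_bbox(records, lat_min, lat_max, lon_min, lon_max):
    """Return the sorted sensor IDs whose location falls inside the box.

    `records` is the parsed JSON list from
    https://data.sensor.community/static/v1/data.json (assumed layout:
    each entry has rec['sensor']['id'] and rec['location'] with
    'latitude'/'longitude' as strings).
    """
    ids = set()
    for rec in records:
        loc = rec.get('location', {})
        try:
            lat = float(loc['latitude'])
            lon = float(loc['longitude'])
        except (KeyError, TypeError, ValueError):
            # Skip entries without usable coordinates
            continue
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            ids.add(rec['sensor']['id'])
    return sorted(ids)
```

To use it, fetch the JSON with `requests.get('https://data.sensor.community/static/v1/data.json').json()` and pass the result to `sensor_ids_in_bbox` together with the bounding box; the returned IDs can then go into the `sensor_ids` list of the download script above.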

I guess your question has already been answered precisely.
But as always, there is a way by foot: get the CSV file, run it through a search for all LAT/LONG pairs between the most northern/southern and most western/eastern points of Belgium, and write each matching line into a new CSV file. Now that the file has a reasonable size, you can import the CSV into the Excel/Access database engine (the data handling is the same; Excel/Access is the front end).
If you want to do more analysis, SQL may be your partner.
Once you have spotted the first sensors outside the scope, it will become routine to delete those outside the borders.
As I mentioned: it is a way by foot and it causes some 'perspiration', but it is a way.
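The on-foot filtering described above could also be sketched in Python before the file ever reaches Excel/Access. This is only a sketch, assuming the archive CSVs use ';' as the delimiter and 'lat'/'lon' as column headers; adjust the parameters if your file differs:

```python
import csv

def filter_rows_by_bbox(in_path, out_path, lat_min, lat_max, lon_min, lon_max,
                        lat_col='lat', lon_col='lon', delimiter=';'):
    """Copy only the rows whose lat/lon fall inside the bounding box.

    Returns the number of rows kept. Column names and the ';' delimiter
    are assumptions about the archive CSV layout.
    """
    with open(in_path, newline='') as fin, \
         open(out_path, 'w', newline='') as fout:
        reader = csv.DictReader(fin, delimiter=delimiter)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames,
                                delimiter=delimiter)
        writer.writeheader()
        kept = 0
        for row in reader:
            try:
                lat = float(row[lat_col])
                lon = float(row[lon_col])
            except (KeyError, TypeError, ValueError):
                # Skip rows with missing or malformed coordinates
                continue
            if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
                writer.writerow(row)
                kept += 1
        return kept
```

The filtered output file keeps the original header, so it imports into Excel/Access exactly like the full file, just smaller.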


hi all
I've developed a small Shiny app for downloading historical data from the sensors. The app also performs a brief analysis of the data using the OpenAir package.
You can find the source code of the app here.
Maybe we can add a country selector and translate it if you find it useful.
Regards