Data by county?

Hi,

Does anyone know if there is a way to download data by county? I am hoping to download all the data for every monitor within Sheffield and was wondering what the best way to do this is. Ideally the data would be organised by county or even by sensor name rather than by date, as I would like to download the full history (all dates) for each sensor in Sheffield.

Hope that makes sense

Hello @Niamh, the archive only has files containing the data of all sensors worldwide, per month and per sensor type (e.g. all SDS011). You can find them at https://archive.sensor.community/csv_per_month/

You can intersect the API data in GIS software like QGIS to get the list of sensors in Sheffield.
Then you can use a script to download the data for those sensors; there are examples in this forum.

Thank you for your reply. I am new to this, so I'm not sure how to work with the JSON data as I've never seen this format before. How do I get the list of all the sensors I need using the JSON?

Also is there a list somewhere for all the sensors in each region?

Many thanks again

I'll check something and give you tips soon.

I'll explain step by step.

1-Get the latest data as JSON

https://data.sensor.community/static/v2/data.dust.min.json

2-Convert the JSON to CSV

3-Open the CSV in Excel

Keep only these columns:
location/latitude, location/longitude, location/country, location/indoor, sensor/id, sensor/sensor_type/name

Save as csv.
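If you prefer to script steps 1-3 instead of using a converter and Excel, here is a minimal Python sketch using only the standard library. The `flatten`/`dump_csv` helpers and the `sensors.csv` file name are illustrative choices, not part of the official API; the field names follow the JSON structure of the endpoint above:

```python
import csv
import json
import urllib.request

# Columns kept for the QGIS import (see step 3 above).
FIELDS = ["latitude", "longitude", "country", "indoor", "sensor_id", "sensor_type"]

def flatten(rec):
    """Extract the kept columns from one record of data.dust.min.json."""
    loc = rec.get("location") or {}
    sensor = rec.get("sensor") or {}
    return [
        loc.get("latitude"),
        loc.get("longitude"),
        loc.get("country"),
        loc.get("indoor"),
        sensor.get("id"),
        (sensor.get("sensor_type") or {}).get("name"),
    ]

def dump_csv(url, path):
    """Download the JSON and write the flattened records as CSV."""
    with urllib.request.urlopen(url) as resp:
        records = json.load(resp)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS)
        for rec in records:
            writer.writerow(flatten(rec))

# Example call (downloads the current snapshot, ~tens of MB):
# dump_csv("https://data.sensor.community/static/v2/data.dust.min.json", "sensors.csv")
```

The resulting CSV can be imported into QGIS as a delimited-text layer in step 4.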

4-Open QGIS

Import the data.


5-Find a shapefile of Sheffield

6-Intersect

You now have the list of sensors in Sheffield.
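If you don't have QGIS to hand, a rough bounding-box filter over the CSV gives a quick approximation of the shapefile intersect. This is a sketch, assuming the CSV has `latitude`, `longitude` and `sensor_id` columns; the Sheffield coordinates below are approximate and should be adjusted, or replaced by a proper shapefile intersect for exact results:

```python
import csv

# Rough Sheffield bounding box (approximate; adjust to your area of interest).
LAT_MIN, LAT_MAX = 53.30, 53.50
LON_MIN, LON_MAX = -1.65, -1.32

def in_bbox(lat, lon):
    return LAT_MIN <= lat <= LAT_MAX and LON_MIN <= lon <= LON_MAX

def sheffield_ids(csv_path):
    """Return the sensor_id of every row whose coordinates fall in the box."""
    ids = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                lat = float(row["latitude"])
                lon = float(row["longitude"])
            except (KeyError, ValueError):
                continue  # skip rows with missing or malformed coordinates
            if in_bbox(lat, lon):
                ids.append(row["sensor_id"])
    return ids
```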


21103
21258
21987
22019
24153
24409
25611
28016
28852
29078
29602
31722
41897
54561
55619
56688
60245
61081
62183
62532
65735
67997
68953
68956
69003
69752
70477
7189
73919
74377
74857
74858
75671
77752
77754
77758
7776
77767
78099
83024

Now you can download from the archive with the instructions in this forum.

Hi, thank you so much for this step-by-step guide, I really appreciate it and will give it a go. Once I have done the above, how would I use that information to download only the data for the specific sites? I already have a list of the Sheffield sites but wasn't sure how to download just their data.

import requests
import time

# Put the sensor IDs in the list, separated by commas
# SDS011 only here
sensor_ids = ["77136", "77144", "77445", "33624", "61705"]
# Put the dates in the list in 'YYYY-MM-DD' format, separated by commas
dates = ["2022-01-01", "2022-01-02", "2022-01-03", "2022-01-04", "2022-01-05", "2022-01-06", "2022-01-07", "2022-01-08", "2022-01-09", "2022-01-10", "2022-01-11", "2022-01-12", "2022-01-13", "2022-01-14", "2022-01-15", "2022-01-16", "2022-01-17", "2022-01-18", "2022-01-19", "2022-01-20", "2022-01-21", "2022-01-22"]

# Archive base URL for 2022; change the year for other years
base_url = 'http://archive.sensor.community/2022/'

for date in dates:
    url_ok = base_url + date
    print(url_ok)
    r1 = requests.get(url_ok)
    source_code = r1.text
    data_to_write = 'sensor_id;sensor_type;location;lat;lon;timestamp;P1;durP1;ratioP1;P2;durP2;ratioP2\n'

    for sid in sensor_ids:
        filename = 'sensor_' + sid + '.csv'

        if filename in source_code:
            # Recover the link that precedes the file name in the directory listing
            split1 = source_code.split(filename)[0]
            split2 = split1.split('<a href="')[-1]
            file_url = url_ok + '/' + split2 + filename
            print(file_url)
            r2 = requests.get(file_url)
            # Skip the header line of each file; one header is written above
            for line in r2.text.splitlines(True)[1:]:
                data_to_write += line

        time.sleep(0.1)  # be polite to the archive server

    # Adjust this output path to your machine
    f = open("/Users/PJ/Desktop/gressent/scrap_file_" + date + ".csv", "a")
    f.write(data_to_write)
    f.close()

Here is the Python script for SDS011.


Thank you for this, I really appreciate the help