Fetching Gaode Maps (AMap) POI data with Python

In a recent project I needed to pull POI data from Gaode Maps (AMap). The main problem to work around is that Gaode Maps returns at most 1000 POI records per query, so the city has to be split into smaller search areas. Taking Chinese restaurants in Shenzhen as an example, the steps are organized below.

1. Obtain Shenzhen's boundary coordinates

1. Register an account on the Gaode Maps (AutoNavi) Open Platform and apply for a Web Service API key.
2. Look up the POI classification code and city code tables.
3. Query Shenzhen's boundary: log in to the Gaode Map Open Platform >> Development Support >> Web Service API >> Development Guide >> API Documentation >> Administrative District Query, and follow the documentation. You can query the boundary coordinates of the Shenzhen area directly in the online example, or fetch them with a Python script (see the sketch after this list).
4. Since Shenzhen contains several separate plots, the small-area data can be removed. The coordinate strings of the individual plots in the returned data are separated by "|". Copy the data into Word, delete the coordinates of the small plots, split the remaining data onto separate lines and copy it into Excel, split it into columns there, add x and y headers, and save it as an Excel 97-2003 .xls file for the ArcMap step that follows.
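For reference, the boundary can also be fetched with a short Python script instead of the online example. The sketch below is only illustrative: it assumes the Administrative District Query endpoint (restapi.amap.com/v3/config/district) with extensions=all, whose polyline field separates coordinate pairs with ";" and separate plots with "|" (consistent with step 4); the key placeholder and output file name are made up for the example.

import requests
import pandas as pd

key = "***********************"   # your Web Service key (placeholder)
url = "https://restapi.amap.com/v3/config/district"
params = {"keywords": "深圳", "subdistrict": 0, "extensions": "all", "key": key}
polyline = requests.get(url, params=params).json()["districts"][0]["polyline"]
# Plots are separated by "|"; keep the longest one as the main Shenzhen boundary,
# which roughly corresponds to deleting the small plots by hand in step 4.
main_ring = max(polyline.split("|"), key=len)
coords = [pair.split(",") for pair in main_ring.split(";")]
df = pd.DataFrame(coords, columns=["x", "y"]).astype(float)
# Save with x and y headers; ArcMap can read this file (or save it as .xls from Excel).
df.to_csv("shenzhen_boundary.csv", index=False)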

2. ArcMap operations: divide Shenzhen into several rectangular areas and obtain the upper-left and lower-right coordinates of each cell

1. Download ArcMap (installers can be found online).
2. Open ArcMap, click File >> Add Data >> Add XY Data >> select the file >> click + to choose the .xls file saved in the previous step >> click OK.

3. Click Export Data >> save it to a folder as a .shp file >> click OK >> Yes.

4. Connect the points into a line: ArcToolbox >> Data Management Tools >> Features >> double-click Points To Line >> in the pop-up dialog box, drag Export_Output_1 into the input box, select the output location >> OK.

Untick the checkbox of the first (point) layer and you get the boundary line of Shenzhen.

5. Create the fishnet: ArcToolbox >> Data Management Tools >> Sampling >> Create Fishnet.

Select the output extent, divide it into 10 rows and 10 columns, and record the four boundary values (top, bottom, left and right).


The fishnet is now created.
6. Edit the fishnet label points: turn off the other layers, right-click the fishnet label layer to open its attribute table, and click Add Field.

Add the fields lng_left, lat_top, lng_right and lat_bot, set the type to Float, and leave the precision at its default. Principle: each fishnet label (center) point is assigned the boundary values of the grid cell it belongs to.

Click Editor >> Start Editing, then select the first column of points and assign lng_left: the first column gets 113.7751453 + 0.0877013 × 0, the second column 113.7751453 + 0.0877013 × 1, the third column 113.7751453 + 0.0877013 × 2, and so on; the tenth column gets 113.7751453 + 0.0877013 × 9.

Then select the first row and assign lat_top: the first row gets 22.861748 - 0.0423167 × 0, the second row 22.861748 - 0.0423167 × 1, the third row 22.861748 - 0.0423167 × 2, and so on; the tenth row gets 22.861748 - 0.0423167 × 9.
Continue by assigning lng_right: the right longitude equals the left longitude plus the cell width, i.e. [lng_left] + 0.0877013. lat_bot works the same way: [lat_top] - 0.0423167.
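As an alternative to typing these values into the field calculator, the same arithmetic can be generated in Python. This is only a sketch based on the numbers recorded above (upper-left corner 113.7751453, 22.861748; cell width 0.0877013; cell height 0.0423167; 10 × 10 cells); the output file name is made up, and the cells outside the Shenzhen boundary still have to be removed as in the ArcMap steps below.

import pandas as pd

lng0, lat0 = 113.7751453, 22.861748   # upper-left corner recorded when creating the fishnet
dlng, dlat = 0.0877013, 0.0423167     # cell width and cell height
rows = cols = 10

cells = []
for r in range(rows):
    for c in range(cols):
        lng_left = lng0 + dlng * c          # column formula from above
        lat_top = lat0 - dlat * r           # row formula from above
        cells.append({
            "lng_left": lng_left,
            "lat_top": lat_top,
            "lng_right": lng_left + dlng,   # right = left + cell width
            "lat_bot": lat_top - dlat,      # bottom = top - cell height
        })

grid = pd.DataFrame(cells)
grid.to_csv("shenzhen_grid.txt", index=False)   # same four columns as the ArcMap export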

After the assignment is complete, click Editor >> Save Edits, then Stop Editing.
Then the extra points need to be deleted: open the Shenzhen boundary layer created earlier and remove the label points that fall outside the boundary. Deleting points also requires Editor >> Start Editing; then select the points outside the area and delete them.
Open the attribute table, export the data, and choose to export it as a text file. The exported file contains the corner coordinates of the small rectangles into which the Shenzhen area has been divided.


3. Processing the data with Python

import pandas as pd
import requests

key = '***********************'   # key applied for on the Gaode Maps developer platform
keywords = "中餐厅"   # search keyword ("Chinese restaurant")
types = "050100"      # POI classification code for Chinese restaurants
url_head = "https://restapi.amap.com/v3/place/polygon?"
text1 = pd.read_csv(r"C:\Users\37957\Desktop\python获取高德poi\Export_Output2.txt")  # path of the text file exported from ArcMap
# for i in range(0, len(text1)):  # fetching every grid cell may exceed the daily quota of 30,000
#                                 # configured by Gaode Maps; fetch the first 35 cells today and the rest tomorrow
for i in range(0, 35):
    # Longitude comes first, separated from latitude by ","; coordinate pairs are
    # separated by "|"; at most 6 decimal places are allowed.
    con_location = str(round(text1["lng_left"][i], 6)) + ',' + str(round(text1['lat_top'][i], 6)) + '|' + str(round(
        text1['lng_right'][i], 6)) + ',' + str(round(text1['lat_bot'][i], 6))
    url_ttl = url_head + "polygon=" + con_location + "&keywords=" + keywords + "&key=" + key + "&types=" + types + "&output=json&offset=20&extensions=base"
    # rep0 = requests.get(url_ttl)
    # json0 = rep0.json()
    # count0 = json0["count"]
    print("Starting data extraction for grid %d." % i)
    # print("%d records in total" % int(count0))
    for p in range(1, 101):  # walk through up to 100 pages
        url = url_ttl + '&page=' + str(p)
        rep1 = requests.get(url)
        json1 = rep1.json()
        pois_data = json1.get('pois', [])
        if not pois_data:  # no more results for this grid (or the request failed): stop paging
            break
        for poi in pois_data:
            poi_name = str(poi["name"])
            poi_adname = str(poi["adname"])
            poi_address = str(poi.get("address", "no detailed address"))
            poi_location = str(poi["location"])
            poi_tel = str(poi.get("tel", "no contact number"))
            poi_info = poi_name + ";" + poi_adname + ";" + poi_address + ";" + poi_location + ";" + poi_tel + "\n"
            poi_info = poi_info.encode('gbk', 'ignore').decode("gbk", "ignore")    # drop characters GBK cannot encode, to avoid write errors
            with open("poi_gaode.txt", 'a') as f:
                f.write(poi_info)
            print("Writing data...")
print("Data extraction finished")

After the extraction finishes, the poi_gaode.txt file generated in the current directory contains the data we need.
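If the result is easier to handle as a table, the semicolon-separated file can be loaded with pandas. This is a minimal sketch; the column names simply mirror the fields written in the loop above, and drop_duplicates is only a precaution in case the same POI is returned more than once.

import pandas as pd

cols = ["name", "adname", "address", "location", "tel"]
# The file was written with the platform's default encoding; pass encoding=... if yours differs.
pois = pd.read_csv("poi_gaode.txt", sep=";", names=cols)
pois = pois.drop_duplicates(subset=["name", "location"])
print("total POIs:", len(pois))
print(pois.head())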


Origin blog.csdn.net/weixin_47796965/article/details/108372378