
Commit 33c2065

Merge commit by tnier01 (2 parents: bd83f16 + debd905).

2 files changed: 119 additions & 16 deletions

README.md (32 additions & 16 deletions)
````diff
@@ -10,33 +10,38 @@ This script is used to send request to four different computer vision APIs and s
 * [iNaturalist](https://www.inaturalist.org/pages/computer_vision_demo): The iNaturalist API is developed by [iNaturalist](https://www.inaturalist.org/), a joint initiative by the California Academy of Science and the National Geographic Society. It is able to identify plant and animal species.
 * [NIA](https://observation.org/apps/obsidentify/): The Nature Identification API is a joint effort by [Observation International](https://observation-international.org/en/), [Naturalis](https://www.naturalis.nl/en) and Intel Corp. It is able to identify plant and animal species.
 
-## Prerequisites
 
-1. Install Python
 
-2. Install Jupyter notebook (more info at https://jupyter.org/install.html)
+## Installation
+### 1. Clone the repo
 ```bash
-pip install notebook
+git clone https://github.com/EibSReM/RequestCollectionComputerVisionAPIs.git
+cd RequestCollectionComputerVisionAPIs
 ```
-3. Install needed packages
 
+### 2. Installation of packages
+
+You can either install Python and all packages directly on your machine, or use a virtual environment manager such as conda.
+
+
+a) Plain installation
+
+1. Install Python
+
+2. Install Jupyter notebook (more info at https://jupyter.org/install.html) and the needed packages
 ```bash
-pip install base64
+pip install notebook
 pip install requests
-pip install getpass
-pip install csv
-pip install os
-pip install json
-pip install pprint
 ```
 
-## Installation
-1. Clone the repo
+b) Using Conda
+
 ```bash
-git clone https://github.com/EibSReM/RequestCollectionComputerVisionAPIs.git
+conda env create -f environment.yml
+conda activate RequestCollectionComputerVisionAPIs
 ```
 
-2. Start the jupyter notebook
+### 3. Start the Jupyter notebook
 ```bash
 jupyter notebook apiRequests.ipynb
 ```
@@ -72,6 +77,17 @@ For all APIs you need to provide user credentials. How you can get these is desc
 * Create an account on Observation.org: https://observation.org/accounts/signup/
 * Write an e-mail to observation.org and ask them to enable your account to access the computer vision model: info@observation.org
 
+## Number of Requests
+The number of requests needed depends on the number of images to be tested: each image requires one request. In our research we tested the APIs with 6 images per Invasive Alien Species of Union concern covered by the respective API, which results in different test sizes:
+* iNaturalist API: 432
+* NIA: 294
+* Pl@ntNet API: 180
+* PlantID: 186
+
+## Execution Time
+The execution time strongly depends on the internet connection and on the response times of the APIs, which often vary. As an example:
+Querying 150 images with Pl@ntNet took 8:13 minutes, of which 1:43 minutes was CPU time.
+
 ## Approach Demo
+In the folder [demo](/demo/) you can find a couple of images and a README with the expected results for demonstration purposes.
 
-* In the folder [demo](/demo/) you can find a couple of images and a README with the expected results for demonstration purposes.
````
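The figures in the "Number of Requests" and "Execution Time" sections can be cross-checked with a little arithmetic. The sketch below only restates numbers given in the README; the per-API species counts are an inference from the stated 6 images per covered species:

```python
# Figures as reported in the README: one request per image,
# 6 test images per species covered by the respective API.
requests_per_api = {"iNaturalist": 432, "NIA": 294, "Pl@ntNet": 180, "PlantID": 186}
IMAGES_PER_SPECIES = 6

# Implied number of Invasive Alien Species each API covered in the test.
species_covered = {api: n // IMAGES_PER_SPECIES for api, n in requests_per_api.items()}
print(species_covered)  # {'iNaturalist': 72, 'NIA': 49, 'Pl@ntNet': 30, 'PlantID': 31}

# Timing example from the README: 150 Pl@ntNet queries took 8:13 min of
# wall time and 1:43 min of CPU time.
wall_s, cpu_s, n_images = 8 * 60 + 13, 1 * 60 + 43, 150
print(round(wall_s / n_images, 2), "s/image wall,", round(cpu_s / n_images, 2), "s/image CPU")
# 3.29 s/image wall, 0.69 s/image CPU
```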

environment.yml (87 additions & 0 deletions)
````diff
@@ -0,0 +1,87 @@
+name: RequestCollectionComputerVisionAPIs
+channels:
+  - conda-forge
+  - defaults
+dependencies:
+  - bzip2=1.0.8=h8ffe710_4
+  - ca-certificates=2022.5.18.1=h5b45459_0
+  - libffi=3.4.2=h8ffe710_5
+  - libzlib=1.2.11=h8ffe710_1014
+  - openssl=3.0.3=h8ffe710_0
+  - pip=22.1.1=pyhd8ed1ab_0
+  - python=3.10.4=hcf16a7b_0_cpython
+  - python_abi=3.10=2_cp310
+  - setuptools=62.3.2=py310h5588dad_0
+  - sqlite=3.38.5=h8ffe710_0
+  - tk=8.6.12=h8ffe710_0
+  - tzdata=2022a=h191b570_0
+  - ucrt=10.0.20348.0=h57928b3_0
+  - vc=14.2=hb210afc_6
+  - vs2015_runtime=14.29.30037=h902a5da_6
+  - wheel=0.37.1=pyhd8ed1ab_0
+  - xz=5.2.5=h62dcd97_1
+  - pip:
+    - argon2-cffi==21.3.0
+    - argon2-cffi-bindings==21.2.0
+    - asttokens==2.0.5
+    - attrs==21.4.0
+    - backcall==0.2.0
+    - beautifulsoup4==4.11.1
+    - bleach==5.0.0
+    - certifi==2022.5.18.1
+    - cffi==1.15.0
+    - charset-normalizer==2.0.12
+    - colorama==0.4.4
+    - debugpy==1.6.0
+    - decorator==5.1.1
+    - defusedxml==0.7.1
+    - entrypoints==0.4
+    - executing==0.8.3
+    - fastjsonschema==2.15.3
+    - idna==3.3
+    - ipykernel==6.13.0
+    - ipython==8.3.0
+    - ipython-genutils==0.2.0
+    - jedi==0.18.1
+    - jinja2==3.1.2
+    - jsonschema==4.5.1
+    - jupyter-client==7.3.1
+    - jupyter-core==4.10.0
+    - jupyterlab-pygments==0.2.2
+    - markupsafe==2.1.1
+    - matplotlib-inline==0.1.3
+    - mistune==0.8.4
+    - nbclient==0.6.3
+    - nbconvert==6.5.0
+    - nbformat==5.4.0
+    - nest-asyncio==1.5.5
+    - notebook==6.4.11
+    - packaging==21.3
+    - pandocfilters==1.5.0
+    - parso==0.8.3
+    - pickleshare==0.7.5
+    - prometheus-client==0.14.1
+    - prompt-toolkit==3.0.29
+    - psutil==5.9.1
+    - pure-eval==0.2.2
+    - pycparser==2.21
+    - pygments==2.12.0
+    - pyparsing==3.0.9
+    - pyrsistent==0.18.1
+    - python-dateutil==2.8.2
+    - pywin32==304
+    - pywinpty==2.0.5
+    - pyzmq==23.0.0
+    - requests==2.27.1
+    - send2trash==1.8.0
+    - six==1.16.0
+    - soupsieve==2.3.2.post1
+    - stack-data==0.2.0
+    - terminado==0.15.0
+    - tinycss2==1.1.1
+    - tornado==6.1
+    - traitlets==5.2.1.post0
+    - urllib3==1.26.9
+    - wcwidth==0.2.5
+    - webencodings==0.5.1
+prefix:
````
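The environment pins `requests==2.27.1`, which the notebook uses to talk to the APIs. As an illustration of the general pattern only (the endpoint URL, auth field, and payload keys below are placeholders, not any real API's schema), an image identification request might be built like this:

```python
import base64

import requests  # pinned as requests==2.27.1 in environment.yml

def build_identification_request(image_bytes, api_url, api_key):
    """Encode an image and prepare a JSON POST to a vision API.

    The URL, auth field, and payload keys are placeholders; each API
    documents its own endpoint, authentication, and field names.
    """
    payload = {
        "api_key": api_key,  # placeholder auth field
        "image": base64.b64encode(image_bytes).decode("ascii"),  # image as base64 text
    }
    # Prepare without sending, so the request can be inspected first;
    # it could then be sent with requests.Session().send(prepared).
    return requests.Request("POST", api_url, json=payload).prepare()

req = build_identification_request(b"\x89PNG fake bytes", "https://example.org/identify", "MY-KEY")
print(req.method, req.headers["Content-Type"])  # POST application/json
```

Preparing the request separately from sending it keeps the sketch runnable offline and mirrors how each API's documented endpoint and credentials would be slotted in.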
