Commit 5ee1a75

add materials

1 parent d482c22

16 files changed: +2929 -2 lines

README.md

Lines changed: 100 additions & 2 deletions
@@ -1,2 +1,100 @@

Removed:

# Python-Web-APIs

D-Lab's 2 hour introduction to web scraping in Python. Learn how to scrape data from websites using BeautifulSoup in Python.

Added:

# D-Lab's Python Web APIs Workshop

[![Datahub](https://img.shields.io/badge/launch-datahub-blue)](LINK)
[![Binder](http://mybinder.org/badge.svg)](LINK)

This repository contains the materials for D-Lab's Python Web APIs Workshop. Prior experience with Python Fundamentals and Python Data Wrangling is assumed.

## Workshop Goals
In this workshop, we cover how to extract data from the web with APIs using Python. APIs are often official services offered by companies and other entities that allow you to query their servers directly and retrieve their data. Platforms like The New York Times, Twitter, and Reddit offer APIs for retrieving data.
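As a minimal sketch of what an API query looks like: you build a URL for a documented endpoint, attach your query parameters and API key, and send an HTTP GET request. The endpoint below follows the NYT Article Search API; the placeholder key and the exact parameters are illustrative, so check the provider's documentation for the real details.

```python
import urllib.parse

# A web API query is just an HTTP GET request to a documented URL.
# This endpoint follows the NYT Article Search API; the key is a
# placeholder (you would obtain a real one from the NYT developer portal).
BASE_URL = "https://api.nytimes.com/svc/search/v2/articlesearch.json"

def build_query_url(base_url, params):
    """Return a request URL with the query parameters percent-encoded."""
    return base_url + "?" + urllib.parse.urlencode(params)

url = build_query_url(BASE_URL, {"q": "election", "api-key": "YOUR_KEY"})
print(url)

# Sending the request and parsing the JSON response (requires the
# `requests` package and a valid key):
#   import requests
#   articles = requests.get(url).json()
```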

When an API is not available, you can turn to web scraping. If you want to learn how to do web scraping in Python, see D-Lab's [Python Web Scraping](https://github.com/dlab-berkeley/Python-Web-Scraping) workshop.

Basic familiarity with Python is assumed. Understanding the material in the [Python Fundamentals](https://github.com/dlab-berkeley/Python-Fundamentals) and [Python Data Wrangling](https://github.com/dlab-berkeley/Python-Data-Wrangling) workshops is highly recommended. We additionally recommend a basic understanding of HTML and CSS.

## Installation Instructions

Anaconda is a package manager that makes it easy to run Python and Jupyter notebooks. Installing Anaconda is the easiest way to make sure you have all the necessary software to run the materials for this workshop. Complete the following steps:

1. [Download and install Anaconda (Python 3.8 distribution)](https://www.anaconda.com/products/individual). Click "Download" and then click the 64-bit "Graphical Installer" for your current operating system.

2. Download the [Python-Web-APIs workshop materials](https://github.com/dlab-berkeley/Python-Web-APIs):

    * Click the green "Code" button in the top right of the repository information.
    * Click "Download Zip".
    * Extract this file to a folder on your computer where you can easily access it (we recommend Desktop).

3. Optional: if you're familiar with `git`, you can instead clone this repository by opening a terminal and entering `git clone git@github.com:dlab-berkeley/Python-Web-APIs.git`.

## Is Python Not Working on Your Computer?

If you do not have Anaconda installed and the materials loaded on your computer by the time the workshop starts, we *strongly* recommend using the UC Berkeley DataHub to run the materials for these lessons. You can access the DataHub by clicking this button:

[![Datahub](https://img.shields.io/badge/launch-datahub-blue)](LINK)

The DataHub downloads this repository, along with any necessary packages, and allows you to run the materials in a Jupyter notebook stored on UC Berkeley's servers. No installation is necessary on your end; you only need an internet browser and a CalNet ID to log in. The DataHub also saves your work, so you can come back to it at any time. To return to your saved work, go straight to [DataHub](https://datahub.berkeley.edu), sign in, and click on the `Python-Web-APIs` folder.

If you don't have a Berkeley CalNet ID, you can still run these lessons in the cloud by clicking this button:

[![Binder](http://mybinder.org/badge.svg)](LINK)

If you use this button, however, you cannot save your work.
## Run the code

1. Open the Anaconda Navigator application. You should see the green snake logo appear on your screen. Note that this can take a few minutes to load the first time.

2. Click the "Launch" button under "Jupyter Notebooks" and navigate through your file system to the `Python-Web-APIs` folder you downloaded above.

3. Open the `lessons` folder, and click `01_api_lesson.md` to begin.

4. Press Shift + Enter (or Ctrl + Enter) to run a cell.

5. By default, the necessary packages for this workshop should already be installed. If needed, you can install them within the Jupyter notebook by running the following line in its own cell:

> ```!pip install -r requirements.txt```

Note that all of the above steps can be run from the terminal, if you're familiar with how to interact with Anaconda in that fashion. However, using Anaconda Navigator is the easiest way to get started if this is your first time working with Anaconda.
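
If you want to confirm that the install worked, one quick (illustrative) check you can run in a notebook cell is to test whether a package imports; `requests` here is an assumption about what `requirements.txt` lists, so substitute the packages you actually need.

```python
import importlib.util

def is_installed(package_name):
    """Return True if `package_name` can be imported in this environment."""
    return importlib.util.find_spec(package_name) is not None

# `requests` is assumed; see requirements.txt for the actual package list.
print(is_installed("requests"))
```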

# About the UC Berkeley D-Lab

D-Lab works with Berkeley faculty, research staff, and students to advance data-intensive social science and humanities research. Our goal at D-Lab is to provide practical training, staff support, resources, and space to enable you to use Python for your own research applications. Our services cater to all skill levels, and no programming, statistical, or computer science background is necessary. We offer these services in the form of workshops, one-to-one consulting, and working groups that cover a variety of research topics, digital tools, and programming languages.

Visit the [D-Lab homepage](https://dlab.berkeley.edu/) to learn more about us. You can view our [calendar](https://dlab.berkeley.edu/events/calendar) for upcoming events, learn about how to utilize our [consulting](https://dlab.berkeley.edu/consulting) and [data](https://dlab.berkeley.edu/data) services, and check out upcoming [workshops](https://dlab.berkeley.edu/events/workshops).

# Other D-Lab Python Workshops

Here are other Python workshops offered by the D-Lab:

## Basic competency

* [Python Fundamentals](https://github.com/dlab-berkeley/python-fundamentals)
* [Introduction to Pandas](https://github.com/dlab-berkeley/introduction-to-pandas)
* [Geospatial Fundamentals in Python](https://github.com/dlab-berkeley/Geospatial-Fundamentals-in-Python)
* [Python Visualization](https://github.com/dlab-berkeley/Python-Data-Visualization)

## Intermediate/advanced competency

* [Computational Text Analysis in Python](https://github.com/dlab-berkeley/computational-text-analysis-spring-2019)
* [Introduction to Machine Learning in Python](https://github.com/dlab-berkeley/python-machine-learning)
* [Introduction to Artificial Neural Networks in Python](https://github.com/dlab-berkeley/ANN-Fundamentals)
* [Fairness and Bias in Machine Learning](https://github.com/dlab-berkeley/fairML)

# Contributors

* [Rochelle Terman](https://github.com/rochelleterman)
* [George McIntire](https://github.com/GeorgeMcIntire)
* [Pratik Sachdeva](https://github.com/pssachdeva)

data/election2020_articles.pkl (2.62 MB, binary)

images/ellington.jpg (88.9 KB)

images/google_link.png (716 KB)

images/google_link_change.png (1.48 MB)

images/google_search.png (322 KB)

images/nytimes_app.png (118 KB)

images/nytimes_docs.png (491 KB)

images/nytimes_key.png (57.3 KB)

images/nytimes_start.png (106 KB)
