An awesome application to find the better you!
Explore the docs »
Report Bug · Request Feature
This repository contains an application designed to track and compare Pokemon card price differences between the Japan and US markets. This tool is especially useful for collectors and resellers looking to take advantage of price discrepancies across markets.
- Track and compare Pokemon card prices between the Japan and US markets.
- Identify profitable opportunities for buying and selling cards across markets.
- Fetch real-time card prices from trusted sources.
- Keep track of price trends and fluctuations over time.
Here are the major frameworks/libraries we used to build our project.
- Navigate to the `backend` folder (`cd backend`), then run `pip install -r requirements.txt` to install all dependencies.
- Next, set up the virtual environment for the backend:
  - For Windows: `python -m venv venv`
  - For Mac: `python3 -m venv venv`
- Then run `python app.py`.
- Navigate to the `frontend` folder (`cd frontend`), then run `npm i --force` to install all dependencies.
- Run `npm run start` to start the frontend.
- You should be directed to http://localhost:3000/.
- One of our key strengths is our ability to consolidate information from disparate sources and derive meaningful insights. In our code, we integrate information from multiple websites, across two different languages, and across various types of data, including card pricing, exchange rates, and shipping information, to identify arbitrage opportunities. Each section of the code has a moderate scope and complexity, but combined they provide real value for our users. (A minimal sketch of this kind of cross-market calculation appears after this list.)
- Our code is a scalable and efficient solution for collecting and monitoring pricing and card information for Pokemon cards from multiple pages of Japanese and US trading websites. We utilize Beautiful Soup and regular expressions to accurately extract relevant information, ensuring consistent and reliable data. The structure of our code is clear, and we can easily scale it to more regions, languages, and currencies (a rough sketch of the extraction approach appears after this list).
- Our code allows users to compare shipping prices from three different shipping companies: Japan Post, DHL, and FedEx. By entering the required information, such as the number of cards, US zip code, and Japan zip code, users can easily view shipping prices and options from each company. Please note that due to the use of Selenium scripts, fetching the results may take approximately 15-30 seconds per shipping company (the general shape of these scripts is sketched after this list). Link
- Our code allows users to easily convert currencies between USD and JPY. You can always click on the money icon to access the currency conversion tool via a pop-up window (the conversion itself is sketched after this list). Link
- Our code effectively extracts US card names and prices for Pokémon cards using BeautifulSoup and Selenium. This is a significant achievement, considering the complexities involved in dealing with different languages, text formats, and website structures. By using both Selenium and Beautiful Soup, we have ensured a robust and flexible approach to data extraction that can handle various challenges and adapt to future changes in website structures. Link Link
- We successfully identified 1,000 cards with valid US prices out of 5,000 cards with JP names and JP prices. This extraction rate speaks to the quality and effectiveness of our code, especially given the challenges of dealing with differences in text, languages, and cultures between the two countries. Link Link
- Despite time constraints, we have developed a fully functional frontend interface that is both user-friendly and visually reasonable. By leveraging multiple libraries and technologies, we have created a solid experience for users navigating our website and learning about the price differences between JP and US cards. Link
- We have incorporated a cart feature that allows users to compare the total price of their selected cards in both US dollars and Japanese yen. This feature helps users make informed decisions about their purchases by offering a clear and straightforward way to assess potential cost savings or earnings. Link
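As a rough illustration of the cross-market comparison described in the list above, the sketch below combines a Japanese listing price, an exchange rate, and a per-card shipping estimate into an estimated profit. The class, field names, and numbers are made up for illustration and are not taken from our actual code.

```python
from dataclasses import dataclass

@dataclass
class CardListing:
    name: str
    jp_price_jpy: float   # price on the Japanese marketplace, in JPY
    us_price_usd: float   # price on the US marketplace, in USD

def arbitrage_profit_usd(listing: CardListing,
                         jpy_per_usd: float,
                         shipping_usd_per_card: float) -> float:
    """Estimated profit (in USD) from buying the card in Japan and selling it in the US."""
    landed_cost_usd = listing.jp_price_jpy / jpy_per_usd + shipping_usd_per_card
    return listing.us_price_usd - landed_cost_usd

# Illustrative numbers only.
card = CardListing(name="Pikachu VMAX", jp_price_jpy=3500, us_price_usd=42.0)
profit = arbitrage_profit_usd(card, jpy_per_usd=150.0, shipping_usd_per_card=4.0)
print(f"{card.name}: estimated profit ${profit:.2f}")  # about $14.67
```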
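The Beautiful Soup and regular-expression extraction described above can be sketched roughly as follows. The URL, the CSS selectors, and the price pattern here are placeholders rather than the ones used against the real trading sites, and for JavaScript-heavy pages the same parsing can be run on Selenium's rendered `page_source` instead of a plain HTTP response.

```python
import re

import requests
from bs4 import BeautifulSoup

PRICE_RE = re.compile(r"[\d,]+(?:\.\d+)?")  # matches strings like "1,280" or "12.99"

def parse_price(text: str) -> float | None:
    """Pull the first numeric price out of a messy price string."""
    match = PRICE_RE.search(text)
    return float(match.group().replace(",", "")) if match else None

def scrape_listings(url: str) -> list[dict]:
    """Fetch one listing page and extract (name, price) pairs.

    The selectors below ('.card-item', '.card-name', '.card-price') are
    placeholders; each supported site needs its own selectors.
    """
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    listings = []
    for item in soup.select(".card-item"):
        name = item.select_one(".card-name")
        price = item.select_one(".card-price")
        if name and price:
            listings.append({
                "name": name.get_text(strip=True),
                "price": parse_price(price.get_text()),
            })
    return listings
```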
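Because the shipping quotes come from driving the carriers' own web forms with Selenium rather than calling their APIs, each lookup has roughly the shape below. The URL, field names, and selectors are placeholders and do not correspond to any particular carrier's page; waiting on a live browser session is also why each quote takes approximately 15-30 seconds.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

def quote_shipping(quote_url: str, jp_zip: str, us_zip: str, num_cards: int) -> str:
    """Fill in a carrier's online quote form and return the displayed price text.

    The locators ("origin-zip", "dest-zip", "quantity", "get-quote",
    ".quote-price") are placeholders; each carrier's page needs its own.
    """
    driver = webdriver.Chrome()  # a visible browser window opens (headless mode proved flaky)
    try:
        driver.get(quote_url)
        driver.find_element(By.NAME, "origin-zip").send_keys(jp_zip)
        driver.find_element(By.NAME, "dest-zip").send_keys(us_zip)
        driver.find_element(By.NAME, "quantity").send_keys(str(num_cards))
        driver.find_element(By.ID, "get-quote").click()
        price_element = WebDriverWait(driver, 30).until(
            EC.presence_of_element_located((By.CSS_SELECTOR, ".quote-price"))
        )
        return price_element.text
    finally:
        driver.quit()
```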
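The USD/JPY conversion behind the currency pop-up and the dual-currency cart totals reduces to a rate lookup plus a multiplication or division. A minimal sketch, with the exchange rate hard-coded purely for illustration (the app fetches a live rate):

```python
def usd_to_jpy(amount_usd: float, jpy_per_usd: float) -> float:
    return amount_usd * jpy_per_usd

def jpy_to_usd(amount_jpy: float, jpy_per_usd: float) -> float:
    return amount_jpy / jpy_per_usd

rate = 150.0  # illustrative rate only; the application fetches the current rate
print(f"$25.00 is about ¥{usd_to_jpy(25.0, rate):,.0f}")   # ¥3,750
print(f"¥3,500 is about ${jpy_to_usd(3500, rate):,.2f}")   # $23.33
```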
- When scraping Japanese and US card prices, the solution is customized to the particular card trading websites. To extract data from more websites, we would need to build customized scraping solutions for those sites as well, and the marginal return is low. However, since most trading websites follow a similar structure when presenting information (name, ID, and price), a possible improvement is to use machine learning techniques to locate prices automatically instead of customizing our code for each website. The complexity of such a solution would be high, since it must generalize across different websites, but it would greatly improve the reliability of our prices because they would come from more sources.
- When attempting to obtain shipping prices from FedEx, USPS, UPS, and DHL, we encountered difficulties using their APIs directly. To access the APIs for FedEx, USPS, and DHL, a business API key is required, which necessitates having an actual company and applying for it. As for UPS, their API only permits quoting shipping prices from the US to Japan, rather than from Japan to the US. Consequently, we decided to utilize Selenium scripts to perform GUI interactions directly on the respective websites. Link
  - This approach results in longer delays, taking approximately 15-30 seconds per shipping company.
  - The Selenium script may occasionally time out, requiring users to rerun the function on the webpage.
  - When running Selenium, a pop-up browser window appears. Although we attempted to use "headless" mode to mitigate this issue, it increased the likelihood of encountering bugs.
- As our website is not yet deployed, we are unable to implement automated crawling functionality that would allow for regular updates, such as hourly or daily refreshes of card prices and information. This limitation impacts the real-time accuracy of our data. After finals, we plan to deploy the website and enhance the crawling functionality to provide more up-to-date information.
- Currently, our project focuses on a select number of websites that sell Pokémon cards. However, not all cards are available on these sites, which may limit the comprehensiveness of our platform. In the future, we plan to incorporate more websites into our data collection process to improve our market coverage and offer a more comprehensive service to our users.
- Our current implementation relies on Beautiful Soup to scrape card information, which requires making an HTTP request for each card individually. This approach can be slow and resource-intensive, especially when dealing with a large number of cards. We are exploring alternative methods for data collection that can improve the efficiency of our scraping process without compromising the quality of our data (one possible direction is sketched after this list).
- Our project relies on the accuracy and availability of data from third-party websites, which may change their structure, policies, or availability at any time. This dependency presents a risk to the stability of our platform, and we will need to continually monitor and adapt to changes in these sources to maintain the accuracy and reliability of our data.
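One possible direction for the per-card request bottleneck noted in the list above is to issue the page fetches concurrently rather than one at a time. The sketch below only illustrates that idea with a thread pool; it is not part of the current codebase.

```python
from concurrent.futures import ThreadPoolExecutor

import requests

def fetch_card_page(url: str) -> str:
    """Fetch a single card page; parsing happens elsewhere."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.text

def fetch_all_pages(card_urls: list[str], max_workers: int = 8) -> list[str]:
    """Fetch many card pages at once instead of making one request at a time."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_card_page, card_urls))
```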
- Home page
- Search page
- Shopping cart
- Volatile card
- Currency page
- Shipping page
Contributions are what make the development community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork this repo and create a pull request. Please also feel free to reach out to one of the Team DoIt members below via email. We'd love to talk.
Don't forget to give the project a star! ⭐ Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b`)
- Commit your Changes (`git commit -m 'RandomMessage'`)
- Push to the Branch (`git push origin`)
- Open a Pull Request
- Yujian (Ken) He - @Kennnnn774 - yhe99@jhu.edu
- Shaopeng Zeng - @SP-Zeng - szeng10@jhu.edu
- Ian Zheng - @ianzhg - yzheng67@jhu.edu
We are thankful for these resources which have helped us on our development journey:






