NUCES, FAST
ISLAMABAD
Pinned
-
tashi-2004/Stock-Price-Forecasting-using-Financial-and-Twitter-Data (Public)
This project integrates stock market analysis, tweet sentiment analysis, and stock price forecasting using ARIMA and GRU. It fetches data from MySQL and MongoDB and streams it via Kafka. YCSB benchmarks…
Jupyter Notebook · 1 star
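A minimal sketch of the ARIMA leg of the forecast, assuming a pandas Series of daily closing prices; the file name, column names, (5, 1, 0) order, and 30-day horizon are illustrative assumptions, not the repository's actual configuration.

import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical input: a CSV with 'date' and 'close' columns (not the repo's schema).
prices = pd.read_csv("prices.csv", parse_dates=["date"], index_col="date")["close"]

# Fit an ARIMA(p, d, q) model; the (5, 1, 0) order is an illustrative guess.
fitted = ARIMA(prices, order=(5, 1, 0)).fit()

# Forecast the next 30 trading days.
forecast = fitted.forecast(steps=30)
print(forecast.head())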
-
tashi-2004/Apache-Airflow-Kafka-Spark-DeltaLake-Real-Time-Stream-Pipeline (Public)
This project implements a real-time data pipeline using Apache Airflow, Kafka, Apache Spark, and Delta Lake. It supports both batch (Coldpath) and real-time (Hotpath) data ingestion, processing, and…
Python
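A minimal sketch of the Hotpath leg under common assumptions: a Kafka topic named events, local paths, and the delta-spark and Kafka connector packages on the classpath; none of these details come from the repository itself.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Requires delta-spark and the Spark Kafka connector on the classpath.
spark = (SparkSession.builder
         .appName("hotpath-demo")
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

# Hypothetical topic and broker; substitute the pipeline's real ones.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "events")
          .load()
          .select(col("value").cast("string").alias("payload")))

# Hot path: append each micro-batch to a Delta table.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/tmp/checkpoints/events")
         .start("/tmp/delta/events"))

query.awaitTermination()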
-
tashi-2004/FMA-A-Dataset-For-Music-Analysis (Public)
Scripts for music feature analysis, model training, and real-time recommendation using Apache Kafka. Extract features, store them in MongoDB, and process the data with Apache Spark. A web interface…
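A minimal sketch of the feature-extraction-and-store step, assuming librosa for MFCC features and pymongo for storage; the file path and the database and collection names are hypothetical, not taken from the repository.

import librosa
import numpy as np
from pymongo import MongoClient

# Hypothetical file and connection details.
path = "track.mp3"
client = MongoClient("mongodb://localhost:27017")
collection = client["fma"]["features"]  # hypothetical database/collection names

# Load audio and extract MFCCs, a common feature set for music analysis.
y, sr = librosa.load(path, sr=22050)
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)

# Store the time-averaged feature vector alongside the track id.
collection.insert_one({
    "track": path,
    "mfcc_mean": np.mean(mfcc, axis=1).tolist(),
})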
-
tashi-2004/Apache-Flink-Spark-Data-Streaming (Public)
This project showcases a real-time data streaming pipeline using Apache Flink, Apache Spark, and Grafana. It streams data, stores it in Parquet format, and performs aggregations for insights, with…
Python
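A minimal sketch of a windowed streaming aggregation written to Parquet, shown with PySpark Structured Streaming and its built-in rate source standing in for the project's real inputs; the paths, window size, and watermark are illustrative.

from pyspark.sql import SparkSession
from pyspark.sql.functions import window, avg

spark = SparkSession.builder.appName("stream-agg-demo").getOrCreate()

# Built-in rate source stands in for the real stream; ten rows per second.
stream = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Windowed average; the watermark lets the file sink run in append mode.
agg = (stream
       .withWatermark("timestamp", "10 seconds")
       .groupBy(window("timestamp", "30 seconds"))
       .agg(avg("value").alias("avg_value")))

# Hypothetical output paths; Grafana would read downstream of this sink.
query = (agg.writeStream
         .format("parquet")
         .option("path", "/tmp/stream-agg")
         .option("checkpointLocation", "/tmp/stream-agg-ckpt")
         .outputMode("append")
         .start())

query.awaitTermination()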
-
tashi-2004/Apache-Kafka-and-Frequent-Item-sets (Public)
This Bash script automates the setup and execution of a data processing pipeline using Apache Kafka and Python scripts, ensuring fault tolerance and streamlined management of Kafka-based data pipelines.
Python · 3 stars
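A minimal sketch of the consuming side, assuming kafka-python, a topic named transactions, and comma-separated baskets as message values; it counts co-occurring item pairs, the core step of Apriori-style frequent-itemset mining, rather than reproducing the repository's exact scripts.

from collections import Counter
from itertools import combinations
from kafka import KafkaConsumer

# Hypothetical topic and broker; messages are assumed to be comma-separated baskets.
consumer = KafkaConsumer(
    "transactions",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: v.decode("utf-8"),
)

pair_counts = Counter()
for i, message in enumerate(consumer):
    basket = sorted(set(message.value.split(",")))
    # Count co-occurring pairs across baskets.
    pair_counts.update(combinations(basket, 2))
    if i and i % 1000 == 0:
        print(pair_counts.most_common(5))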
-
tashi-2004/Apache-Hadoop-Spark-Hive-CyberAnalytics (Public)
This project utilizes Apache Hadoop, Hive, and PySpark to process and analyze the UNSW-NB15 dataset, enabling advanced query analysis, machine learning modeling, and visualization. The project demonstrates…
Jupyter Notebook
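A minimal sketch of the PySpark modeling step, assuming a local CSV export of UNSW-NB15; the path and the three numeric feature columns are an illustrative subset of the dataset's schema, and logistic regression stands in for whatever models the project actually trains.

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("unsw-demo").getOrCreate()

# Hypothetical path; the real UNSW-NB15 schema is much wider.
df = spark.read.csv("unsw_nb15.csv", header=True, inferSchema=True)

# Assemble a few numeric columns into a feature vector (illustrative choice).
assembler = VectorAssembler(
    inputCols=["dur", "sbytes", "dbytes"], outputCol="features")
data = assembler.transform(df).select("features", "label")

# Train/test split and a simple binary classifier on the 0/1 attack label.
train, test = data.randomSplit([0.8, 0.2], seed=42)
model = LogisticRegression(labelCol="label").fit(train)
print("test accuracy:", model.evaluate(test).accuracy)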