Learn by Building Real Industry Systems

AiCore Scenarios let you build the end-to-end AI & Data systems deployed at Meta, Airbnb, and other leading tech companies.

What are AiCore Scenarios?

Scenarios put you in the position of an engineer on the job. You are dropped into cloud infrastructure that mirrors what you’d find in the workplace, and challenged to follow step-by-step instructions to build these companies’ data pipelines and models, learning by doing.

We are proud to have pioneered scenarios as a way of learning.

Launch the pre-configured environment

Don’t waste hours configuring software and setting up version control. Build in virtual environments that come pre-installed with all the software you need, so you can start learning instantly.

Follow step-by-step instructions and schematics

Each industry system comes with full schematics and in-depth, step-by-step instructions that you can follow to build it.

Learn through our content library

Stuck? Use our comprehensive content library of hundreds of videos. We cover everything from the basics of Python to advanced AWS cloud engineering.

Four career paths. Four scenarios.

Pinterest's Experimentation Data Engineering Pipeline

Become a Data Engineer

Pinterest has world-class machine learning engineering systems. Every day, it processes billions of user interactions, such as image uploads and image clicks, to inform the decisions it makes. In this project, you’ll build the cloud system that ingests those events and runs them through two separate pipelines: one that computes real-time metrics (like profile popularity, used to recommend that profile in real time), and another that computes metrics that depend on historical data (such as the most popular category this year).
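
To give a flavour of the build, here is a minimal Python sketch of the streaming half, assuming a Kafka topic named pinterest.events and a simple clicks-per-category metric; the topic name, event fields, and aggregation are illustrative, not the scenario’s exact specification.

import json
from kafka import KafkaProducer          # kafka-python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# An API (e.g. built with FastAPI) pushes each user interaction onto a Kafka topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"))
producer.send("pinterest.events",
              {"user_id": 42, "action": "image_click", "category": "home_decor"})

# A Spark Structured Streaming job reads the topic and computes a real-time metric.
spark = SparkSession.builder.appName("realtime-metrics").getOrCreate()
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "pinterest.events")
          .load())
clicks_per_category = (events
    .select(F.get_json_object(F.col("value").cast("string"), "$.category").alias("category"))
    .groupBy("category").count())
query = (clicks_per_category.writeStream
         .outputMode("complete").format("console").start())
query.awaitTermination()   # runs until the stream is stopped

The batch pipeline for historical metrics, orchestrated with Airflow, follows the same ingest-then-aggregate pattern over data stored in S3.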

Tools used:

AWS (S3, RDS)
Spark (Streaming, Core)
Kafka
Airflow
Prometheus & Grafana
Docker
FastAPI
Python
Git & GitHub
Become a data engineer

Migrating Skyscanner's Data to Tableau

Become a Data Analyst

Take on the role of a Data Analyst at Skyscanner and work on a project to migrate existing data analytics tasks from a manual, Excel-based system into interactive Tableau reports. This is a widely experienced scenario that will prepare you for the job before you land it and give you hands-on practice with the key tools data analysts need. It requires an understanding of how data is stored and analysed using both traditional and cutting-edge tools, and it will challenge you to build intuitive visualisations of a complex, real-world dataset. By the end, you’ll have built an analytics engine that could empower hundreds of teams across a company to provide more value for their business.
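
As a taste of the migration work, here is a minimal Python sketch that moves one Excel-based report into a database table for Tableau to connect to; the file, columns, and connection string are hypothetical placeholders, not the scenario’s actual data.

import pandas as pd
from sqlalchemy import create_engine

# Read the legacy Excel report and tidy it with pandas.
bookings = pd.read_excel("monthly_bookings.xlsx", sheet_name="raw")
bookings["booking_date"] = pd.to_datetime(bookings["booking_date"])
bookings = bookings.dropna(subset=["route", "price"])

# Recompute the metric the old spreadsheet calculated by hand.
bookings["month"] = bookings["booking_date"].dt.strftime("%Y-%m")
revenue_by_route = (bookings.groupby(["route", "month"])["price"]
                    .sum().reset_index(name="revenue"))

# Load it into a database table; the Tableau dashboard then connects to this table.
engine = create_engine("postgresql://analyst:secret@localhost:5432/analytics")
revenue_by_route.to_sql("revenue_by_route", engine, if_exists="replace", index=False)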

Tools used:

Tableau
SQL
Python (data visualisation, data processing)
Databases
Become a data analyst

Airbnb’s Multimodal Property Intelligence System

Become a Data Scientist

Airbnb deploys a multimodal neural network to understand the listings uploaded by hosts. Multimodal means it processes several forms (modalities) of data, including images and text descriptions. At Airbnb this system is used to improve search rankings, predict complaints, and detect fraud. It’s a challenging problem that requires advanced data manipulation and feature engineering, and one of the primary challenges is building the system so that it can scale to training on a large dataset. Once you’ve developed it, you’ll deploy it to serve predictions through an API.
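
To show what “multimodal” looks like in code, here is a minimal PyTorch sketch that fuses an image embedding with a text-description embedding before making a prediction; the layer sizes, vocabulary size, and class count are illustrative, not the scenario’s actual architecture.

import torch
import torch.nn as nn

class ListingModel(nn.Module):
    def __init__(self, vocab_size=10_000, num_classes=10):
        super().__init__()
        # Image branch: a small CNN that maps a 64x64 photo to a 128-d embedding.
        self.image_encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 128))
        # Text branch: embed the description tokens and average them into a 128-d vector.
        self.text_encoder = nn.EmbeddingBag(vocab_size, 128)
        # Fusion head: concatenate both modalities and predict the target.
        self.head = nn.Linear(128 + 128, num_classes)

    def forward(self, image, tokens):
        return self.head(torch.cat([self.image_encoder(image),
                                     self.text_encoder(tokens)], dim=1))

model = ListingModel()
logits = model(torch.randn(4, 3, 64, 64), torch.randint(0, 10_000, (4, 20)))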

Tools used:

AWS (RDS, S3)
PyTorch (Core, TorchText, TorchVision)
Pandas
scikit-learn
TensorBoard
Python
Git & GitHub
Become a data scientist

Facebook's Marketplace Product Ranking Algorithm

Become a Machine Learning Engineer

Use transfer learning on a pre-trained ResNet-50 neural network to build the computer vision and NLP models that process images and descriptions, then combine them into the multimodal transformer architecture Facebook uses to rank Marketplace search results. Deploy those models at Facebook scale using Kubernetes.
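
As a flavour of the transfer-learning step, here is a minimal PyTorch sketch that freezes a pre-trained ResNet-50 and retrains only a new final layer on product categories; the category count and learning rate are illustrative assumptions.

import torch
import torch.nn as nn
from torchvision import models

NUM_CATEGORIES = 13   # assumed number of Marketplace product categories

resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
for param in resnet.parameters():
    param.requires_grad = False            # freeze the pre-trained backbone
resnet.fc = nn.Linear(resnet.fc.in_features, NUM_CATEGORIES)   # new trainable head

# During fine-tuning, only the new head's parameters are passed to the optimiser.
optimizer = torch.optim.Adam(resnet.fc.parameters(), lr=1e-3)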

Tools used:

AWS (RDS, S3, EKS)
PyTorch (Core, TorchText, TorchVision)
FAISS
Kubernetes, Kubeflow
Docker
FastAPI
Python
Git & GitHub
Become a machine learning engineer

Ready to become an AI & Data professional?

Apply Now