The world’s most industry-informed, hands-on course in AI & Data
An immersive programme that will help you discover your place in the future of AI & Data and launch your career at supersonic speed.
Learn by building and deploying production-grade systems within a thriving community of industry experts.
Gain experience building real systems through industry projects

Our industry projects put you in the position of an engineer on the job. You are dropped into cloud infrastructure that mirrors what you’d find in the workplace and challenged to follow step-by-step instructions to build data pipelines and models, learning by doing.
Four career paths. Four specialisms.
Build a solid foundation in software engineering
Software Engineering & Data Manipulation Essentials
Learn the core of writing production-ready code, following industry best practices.
Build a complete data solution for a multinational organisation, from data acquisition to analysis. Write Python code to extract large datasets from multiple data sources. Utilise the power of Pandas to clean and analyse the data. Build a star-schema database for optimised data storage and access. Perform complex SQL queries to extract valuable insights and make informed decisions for the organisation.
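The flow described above (extract, clean with Pandas, store facts and dimensions separately, then query) can be sketched in miniature. This is a minimal illustration only: the data, column names and categories below are hypothetical stand-ins, not the project's actual schema.

```python
import pandas as pd

# Hypothetical raw orders, as they might arrive from one of the sources
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "product_code": ["A1", "A2", "A1", None],
    "amount": ["10.5", "20.0", "n/a", "15.0"],
})

# Clean: drop rows missing a product code, coerce amounts to numbers
orders = orders.dropna(subset=["product_code"])
orders["amount"] = pd.to_numeric(orders["amount"], errors="coerce")
orders = orders.dropna(subset=["amount"])

# A star schema keeps facts (orders) separate from dimensions (products);
# analysis joins them back together, approximated here with a merge
products = pd.DataFrame({"product_code": ["A1", "A2"],
                         "category": ["Homeware", "Toys"]})
report = (orders.merge(products, on="product_code")
                .groupby("category")["amount"].sum())
print(report)
```

In the project itself the fact and dimension tables live in a SQL database and the final step is a SQL query rather than a Pandas merge, but the shape of the work is the same.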
Module 1: The command line
- Navigation
- Essential Commands and Syntax
Module 2: Git and GitHub
- Essential Commands and Syntax
- Version Control
- Branching
- Pull requests
Module 3: Python Programming
- The Python environment
- Debugging
- Arithmetic, Variable Assignment and Strings
- Lists and Sets
- Dictionaries, Tuples and Operators
- Control Flow
- Loops
- Functions
- Object Oriented Programming
- Advanced Python
- Error Handling
Module 4: Data Formats and Processing Libraries
- JSON, CSV, XLSX and YAML
- Tabular Data
- Intro to Pandas
- Pandas DataFrames
- Data Cleaning in Pandas
- Advanced Pandas DataFrame Operations
Module 5: APIs & Web Scraping
- APIs and Requests
Module 6: Software design
- Principles of OOP Design
- Inheritance, Polymorphism, Abstraction, Encapsulation
- Class Decorators
- Docstring and Typing
- Project Structure
Module 7: SQL
- What is SQL?
- SQL Setup
- SQL Tools Setup
- SQL Commands
- SQL best practices
- SELECT and Sorting
- The WHERE Clause
- CRUD Creating Tables
- CRUD Altering Tables
- SQL JOINs
- SQL JOIN Types
- SQL Common Aggregations
- SQL GROUP BY
- Creating Subqueries
- Types of Subqueries
- CRUD Subquery Operations
- Common Table Expressions (CTEs)
- psycopg2 and SQLAlchemy
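The module above builds from single-table queries up to CTEs. For a self-contained taste of that progression, here is a sketch using Python's built-in sqlite3 module; the course itself works with PostgreSQL via psycopg2 and SQLAlchemy, but this aggregation and CTE syntax carries over, and the table here is a made-up example.

```python
import sqlite3

# An in-memory database with a toy sales table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 100.0), ("North", 50.0), ("South", 30.0)])

# A CTE names an intermediate result so the outer query stays readable
query = """
WITH regional_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total
FROM regional_totals
WHERE total > 40
ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('North', 150.0)]
```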
Module 8: Essential Cloud Technology
- AWS Essentials
- The AWS CLI and Python SDK (boto3)
- Data Lake Storage on AWS S3
- AWS RDS for Data Warehouse Storage
Module 9: CI/CD
- GitHub Actions
Then choose your specialist career path
Data Engineering
Learn how to store, share and process various types of data at scale.
Build Pinterest's experiment analytics data pipeline, which runs thousands of experiments per day and crunches billions of data points to provide valuable insights that improve the product.
Module 1: Big data engineering foundations
- Introduction
- The Data Engineering Landscape and Lifecycle
- Data Pipelines
- Data Ingestion and Data Storage
- Enterprise Data Warehouses
- Batch vs Real-Time Processing
- Structured, Unstructured and Complex Data
Module 2: Data ingestion
- Principles of Data Ingestion
- Batch Processing
- Real-Time Data Processing
- APIs and Requests
- FastAPI
- Kafka Essentials
- Kafka-Python
- Streaming in Kafka
Module 3: Data wrangling and transformation
- Data Transformations: ELT & ETL
- Apache Spark and PySpark
- Distributed Processing with Spark
- Integrating Spark & Kafka
- Integrating Spark & AWS S3
- Spark Streaming
Module 4: Data storage
- AWS Essentials
- The AWS CLI and Python SDK (boto3)
- Data Lake Storage on AWS S3
- Psycopg2 and SQLAlchemy
- pgAdmin 4
Module 5: Data orchestration
- Apache Airflow
- Integrating Airflow & Spark
Module 6: Advanced Cloud Technologies and Databricks
- MSK Essentials and Connect
- AWS API Gateway
- Integrating API Gateway with Kafka
- Databricks Essentials
- Integrating Databricks with Amazon S3
- AWS MWAA
- Orchestrating Databricks Workloads on MWAA
- AWS Kinesis
- Integrating Databricks with AWS Kinesis
- Integrating API Gateway with Kinesis
Data Analytics
Learn how to discover and analyse raw data to derive useful patterns, trends, relationships and insights, and communicate these in a visual manner to enhance decision making.
Take on the role of a Data Analyst at Skyscanner and work on a project requirement to migrate existing data analytics tasks from an Excel-based manual system into interactive Tableau reports.
Module 1: Data wrangling and cleaning
- Data loading
- Data cleaning
- Data integration
- Data exporting
Module 2: PostgreSQL RDS Data Import and Reporting
- Connecting to pgAdmin 4
- Creating databases and tables
- Importing data
- Data exploration and statistical analysis
Module 3: Integrate Tableau Desktop with PostgreSQL RDS
- Setting up Tableau Desktop
- Configuring PostgreSQL connector
- Connecting to databases
Module 4: Create Tableau Reports
- Tableau data exploration
- Data analysis and visualisation
- Creating reports
Data Science
Learn to visualise, preprocess and model data with statistical tools and machine learning algorithms.
Model Airbnb’s property listing dataset. Build a framework that systematically trains, tunes, and evaluates models on several tasks tackled by the Airbnb team.
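The train-tune-evaluate loop at the heart of such a framework combines grid search with k-fold cross-validation, both covered in the modules below. A NumPy-only sketch under stated assumptions: the data is synthetic, and closed-form ridge regression stands in for whatever model family the task calls for.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a listings dataset: features X, target y (e.g. price)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)

def fit_ridge(X, y, alpha):
    # Closed-form ridge regression: w = (X'X + alpha*I)^-1 X'y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

def cv_score(alpha, k=5):
    # k-fold cross-validation: average mean-squared error over held-out folds
    folds = np.array_split(np.arange(len(X)), k)
    errors = []
    for fold in folds:
        mask = np.ones(len(X), bool)
        mask[fold] = False
        w = fit_ridge(X[mask], y[mask], alpha)
        errors.append(np.mean((X[fold] @ w - y[fold]) ** 2))
    return np.mean(errors)

# Grid search: evaluate every hyperparameter setting, keep the best
grid = [0.01, 0.1, 1.0, 10.0]
best_alpha = min(grid, key=cv_score)
print("best alpha:", best_alpha)
```

The same skeleton generalises: swap in a different model, a richer hyperparameter grid, and real data, and you have the systematic comparison framework the project asks for.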
Module 1: Data Cleaning and Exploratory Data Analysis
- Data Visualisation
- Multicollinearity
- Influential Points: Leverage and Outliers
Module 2: Introduction to machine learning
- Data for ML
- Intro to models - Linear Regression
- Validation and Testing
- Gradient Based Optimisation
- Bias and Variance
- Hyperparameters, Grid Search and K-Fold Cross Validation
Module 3: Classification
- Binary Classification
- Multiclass Classification
- Multilabel Classification
Module 4: Theory
- Maximum Likelihood Estimation
- Evaluation Metrics
Module 5: Popular Supervised Models
- K-Nearest Neighbours
- Classification Trees
- Support Vector Machines
- Regression Trees
Module 6: Ensembles
- Ensembles
- Random Forests and Bagging
- Boosting and Adaboost
- Gradient Boosting
- XGBoost
Module 7: Neural Networks
- Neural networks
- Dropout
- Batch Normalisation
- Optimisation for deep learning
- Convolutional Neural Networks (CNNs)
- ResNets
Machine Learning Engineering
Learn when and where machine learning models, including neural networks, are used within systems and how they are deployed.
Build Facebook Marketplace’s recommendation ranking system. Facebook Marketplace is a platform for buying and selling products on Facebook. This is an implementation of the system behind the marketplace, which uses AI to recommend the most relevant listings based on a personalised search query.
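The core of such a ranking system is vector similarity search: embed listings and the query as vectors, then rank listings by closeness to the query (the course covers doing this at scale with FAISS). A minimal NumPy sketch of the idea, with hypothetical two-dimensional embeddings:

```python
import numpy as np

# Hypothetical embeddings: each listing and the query mapped to a vector
listing_vectors = np.array([
    [1.0, 0.0],   # listing 0
    [0.9, 0.1],   # listing 1
    [0.0, 1.0],   # listing 2
])
query = np.array([1.0, 0.05])

# Cosine similarity between the query and every listing
norms = np.linalg.norm(listing_vectors, axis=1) * np.linalg.norm(query)
scores = listing_vectors @ query / norms

# Rank listings by similarity, highest first
ranking = np.argsort(-scores)
print(ranking.tolist())
```

In production the brute-force dot product is replaced by an approximate index such as FAISS so that billions of listings can be searched in milliseconds.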
Module 1: Introduction to machine learning
- Data for ML
- Intro to models - Linear Regression
- Validation and Testing
- Gradient Based Optimisation
- Bias and Variance
- Hyperparameters, Grid Search and K-Fold Cross Validation
Module 2: Classification
- Binary Classification
- Multiclass Classification
- Multilabel Classification
Module 3: PyTorch
- Automatic differentiation
- PyTorch Datasets and DataLoaders
- Making custom datasets
Module 4: Neural Networks
- Neural networks
- Dropout
- Batch Normalisation
- Optimisation for deep learning
- Convolutional Neural Networks (CNNs)
- ResNets
Module 5: Practical
- Architecture, data augmentation & debugging tips
- Pre-trained models
- Transfer learning
- Hardware acceleration (GPUs & TPUs)
Module 6: Applications
- Churn Modelling
- FAISS Vector Search
- Image Based Search
Module 7: Building APIs
- Intro to FastAPI
- Deploying FastAPI
- Efficient FastAPI
Career support
Work with our outcomes team to launch your new career.
Programme Schedule
The programme is fully remote. There are no traditional “classes” to attend. You can progress through your learning and projects on whatever schedule is convenient to you, booking time with support engineers to guide you as you need it.
Drop-in live support sessions available throughout the day and evening
Online community meetups Monday to Thursday, 6:30PM to 9:30PM, where you can work alongside your peers, with support engineers on hand for instant support
Launch your career with AiCore support
Career playbook
Have your CV, LinkedIn and GitHub portfolio optimised. Learn how to source your ideal roles.
Get referred by alumni
Our alumni network hires directly from AiCore. Over 15% of AiCore grads get hired this way.
Interview coaching
Feel 100% confident going into any hiring process. Our team will prepare you with general and technical mock interviews.
Curated job board
Access our internal job board of curated roles.
Success stories
Learning packages that work for you
Professional certification
Get the skills and experience you need to become a qualified data analyst, data scientist, data engineer or machine learning engineer
Career launch
The end-to-end solution for launching your career as a data analyst, data scientist, data engineer or machine learning engineer
Frequently Asked Questions
Who are we?
AiCore is a specialist AI & Data career accelerator. We deliver an immersive programme that will launch your career in AI & Data at supersonic speed.
AiCore was founded by Harry Berg, Christian Kerr and Haron Shams.
Harry and Haron originally founded the Machine Learning Society at Imperial College London, and Christian was Chief of Staff at CogX, Europe's largest festival on AI.
They met at an AI conference in London and came together to create AiCore.
Over the next year, they taught a community of over 5,500 AI & Data enthusiasts and, using industry feedback, started developing the AiCore programme.
AiCore is now a team of over 20 people working to deliver the world's most industry-informed, hands-on education in AI & Data.
Where will I take classes?
We don't have classes in the traditional sense. All the learning you do at AiCore is directed towards completing industry projects. You consume material on your own, then come to meetups every night to work on your projects alongside your peer group, with support engineers on hand to offer instant live support.
All of this takes place exclusively online, so you can learn from the comfort of your own home. On top of the meetups, we also hold public events with industry experts every week, and private events including industry mentor office hours and open hacking for students only.
How will AiCore help me land a job?
After completing the essentials part of the programme, you take career preparation alongside your specialist pathway. This includes soft-skills workshops as part of the programme, access to our internal job marketplace of curated and exclusive roles, mock interview sessions, a weekly progress checklist to keep you accountable during the course, and an audit of your CV, GitHub and LinkedIn accounts to make sure they are shining showcases of your skills.
How do I secure a place on the course?
To ensure all AiCore students get the individual attention and resources they need to succeed, course sizes are limited. After submitting your application, you’ll connect with the admissions team, who will determine whether the course is a good fit for your experience and goals. Additionally, you’ll complete an admissions assessment to make sure you’re prepared for the rigour of the curriculum. Once you’ve been accepted and set up your learning package, your spot in the course is secure. Connect with our admissions team now for more details.
What do I need to do to graduate?
To graduate you need to complete the projects and quizzes.
Aside from class, how much time am I expected to spend studying or working?
Classes consist of 3-hour sessions, 4 days per week, for a total of 12 hours. On top of this, you should aim to spend 8 hours outside of class reviewing content and completing challenges, assessments and projects. That makes 20 hours in total, so this is a part-time programme.