The Role at a Glance:
We are excited to bring on an AI/ML Engineering Intern to join our Summer 2026 cohort.
As an AI/ML Intern, you will be embedded in one of our high-performing agile data science and engineering teams. Our teams are autonomous and cross-functional, working on innovative solutions that support enterprise-wide initiatives. This internship will allow you to make direct contributions by applying machine learning, data engineering, and cloud-based AI tools to real-world problems. You will gain mentorship and hands-on experience in areas such as data preprocessing, model training and evaluation, natural language processing, and deployment of models on cloud platforms such as AWS, using tools such as GitLab.
What you’ll be doing:
- Collaborating with data scientists and engineers to design, train, and evaluate machine learning models (e.g., classification, regression, NLP, computer vision).
- Supporting the development of AI/ML pipelines, including data ingestion, preprocessing, feature engineering, and model deployment.
- Writing clean, efficient Python code to implement algorithms, perform data analysis, and automate workflows.
- Leveraging cloud platforms (AWS) and tools such as GitLab for version control and CI/CD integration.
- Conducting exploratory data analysis (EDA) and visualizations to uncover insights and improve model performance.
- Documenting methodologies, experiments, and results to ensure reproducibility and knowledge sharing.
What we’re looking for:
Must-haves:
- Rising college junior or senior
- Availability to work 40 hours per week, Monday–Friday, between June 1, 2026 and August 7, 2026
- A minimum GPA of 3.0
- Exceptional communication skills
- Self-motivated and results-oriented
- Demonstrated critical thinking and problem-solving skills
- Authorization to work in the United States without sponsorship
Nice-to-haves:
- Hands-on experience with deep learning frameworks (TensorFlow, PyTorch, or Keras).
- Familiarity with NLP techniques (transformers, embeddings, large language models).
- Exposure to data visualization and BI tools (Tableau, Power BI, or matplotlib/seaborn).
- Knowledge of MLOps practices (model monitoring, observability, retraining pipelines).
- Understanding of database technologies (SQL and NoSQL) and data lake/warehouse environments.