
This course is designed for ML Engineers & ML Researchers

ML Engineers

who build and maintain ML services and pipelines for their organization

Champion Transformer models in your organization, finding opportunities to use them to solve business problems.

Apply the abstractions and utilities that are available for simplifying the ML project lifecycle with Transformers.

Connect to a vibrant network of researchers and engineers who value collaboration and open science.

ML Researchers

who are looking to bring the latest technologies and techniques into their lab or organization

Apply Transformer models in your research, and mentor peers in doing the same.

Reflect on the current (and future) developments of Transformers, searching for connections to domains that have not yet been explored.

Meet your Instructors

Lewis Tunstall

Machine Learning Engineer

Lewis Tunstall is a machine learning engineer at Hugging Face, currently focused on optimizing Transformers for production workloads and researching novel techniques to train these models efficiently. He is also a co-author of the best-selling book “Natural Language Processing with Transformers” and has taught dozens of workshops on the topic to enterprises, universities, and the machine learning community at large.

Before joining Hugging Face, Lewis shipped machine learning powered applications for Swiss enterprises in the domains of natural language processing, time series, and topological data analysis. He has a PhD in Physics and has held research appointments at premier institutions in Australia, the United States, and Switzerland.

Nima Boscarino

ML Developer Advocate

Nima Boscarino is a Developer Advocate at Hugging Face, where he helps community members make the most of the Hub, Transformers, Gradio, and the rest of the Hugging Face toolchain. Between creating educational content, facilitating workshops, and contributing to open source projects, he’s in his happy place!

In a prior life, Nima worked as a software engineer and educator, and he continues to draw on those experiences to promote socially conscious and responsible technologies.

Leandro von Werra

Machine Learning Engineer

Leandro von Werra is a machine learning engineer at Hugging Face, where he works on model evaluation and is the maintainer of 🤗 Evaluate. In addition, he works on machine learning for code and developed the open-source CodeParrot models.

He has several years of industry experience bringing NLP projects to production by working across the whole machine learning stack, and he is the creator of TRL, a popular Python library that combines Transformers with reinforcement learning.

About Hugging Face’s Live Cohort

Transformer-based models have taken the Machine Learning world by storm! Having immediately revolutionized Natural Language Processing, Transformers continue to make an impact in both cutting-edge research and industry applications. The successes of Transformer models like BERT and GPT have kickstarted a rapidly growing ecosystem of models and techniques, and data-driven organizations are wary of being left behind. Hugging Face exploits the properties of Transformer models to make it easy for individuals to “fine-tune” their own models, lowering the barrier to research and allowing traditional software engineers to incorporate machine learning functionality into their stacks. This course equips ML practitioners with the knowledge, skills, and tools they’ll need to use state-of-the-art models to solve their problems. Along the way, we’ll tackle questions like:

What key features have made Transformer models so successful in NLP?

How can I use Transformer models to solve problems in my domain?

What technical considerations do I need to address in order to develop and deploy my own Transformer models?

What does the future hold for Transformers?

How can I keep up with the latest Transformer models and integrate them into my work?

Session 1 - Hello Transformers

Monday, September 19
8-10 AM (PT)

In session one, the instructor will cover the importance and uses of Transformer models in the NLP ecosystem. At the end, learners will be able to:

Compare and contrast the various historical methods in NLP that led to Transformer-based models

Map the high-level intuition behind Transformer models to their impact and use-cases in NLP tasks

Construct a demo using the Hugging Face stack to showcase a model
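To give a flavor of that last objective, here is a minimal sketch of such a demo, built with the transformers pipeline API and Gradio. The sentiment-analysis task and the pipeline’s default checkpoint are illustrative assumptions, not necessarily the example used in class:

```python
# A minimal demo sketch, assuming the transformers and gradio packages
# are installed. The sentiment-analysis task and default checkpoint are
# illustrative choices.
from transformers import pipeline
import gradio as gr

# Load an inference pipeline with its default pretrained checkpoint.
classifier = pipeline("sentiment-analysis")

def classify(text):
    # The pipeline returns one dict per input,
    # e.g. [{"label": "POSITIVE", "score": 0.99}].
    result = classifier(text)[0]
    return f"{result['label']} ({result['score']:.2f})"

# Wrap the function in a simple text-in, text-out web UI.
demo = gr.Interface(fn=classify, inputs="text", outputs="text")
demo.launch()
```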

Session 2 - Training Transformers from Scratch

Wednesday, September 21
8-10 AM (PT)

In session two, we’ll talk about using and fine-tuning Transformer models. By the end, learners will be able to:

Use an end-to-end inference pipeline

Distinguish the components that make up a Hugging Face Transformers inference pipeline

Frame an NLP problem and develop a solution by fine-tuning a relevant model on a suitable dataset
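As a preview of that fine-tuning workflow, the sketch below uses the datasets library and the Trainer API. The IMDb dataset, DistilBERT checkpoint, and training settings are illustrative assumptions rather than the session’s actual materials:

```python
# A hedged fine-tuning sketch, assuming the transformers and datasets
# packages. Dataset and checkpoint are illustrative choices.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Load a labeled sentiment dataset from the Hugging Face Hub.
dataset = load_dataset("imdb")

checkpoint = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

def tokenize(batch):
    # Truncate long reviews to the model's maximum input length.
    return tokenizer(batch["text"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="imdb-finetuned", num_train_epochs=1),
    # Small subsets keep the sketch quick to run end to end.
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].shuffle(seed=42).select(range(500)),
    tokenizer=tokenizer,
)
trainer.train()
```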

Session 3 - Transformer Anatomy

Monday, September 26
8-10 AM (PT)

In session three, the instructor will explore how Transformers “understand” language data by using the attention mechanism to create embeddings, and the role that embeddings play in NLP. Learners will leave the session able to:

Recommend a particular NLP “task” to address a given problem

Demonstrate embeddings in action by building a Q&A search engine, leveraging an understanding of attention and contextualized representations

Experiment with loading, exploring, and processing datasets from the Hugging Face Hub
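Building on the Q&A search-engine objective above, here is one minimal way to sketch embedding-based semantic search. It assumes the sentence-transformers library, and the checkpoint and toy corpus are stand-ins rather than the session’s actual materials:

```python
# A minimal semantic-search sketch, assuming the sentence-transformers
# package. The checkpoint and toy corpus are illustrative.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# A toy document collection standing in for a real knowledge base.
corpus = [
    "How do I fine-tune a Transformer model?",
    "What datasets are available on the Hugging Face Hub?",
    "How can I deploy a model to production?",
]
# Embed every document once; in practice these vectors would be cached.
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

query = "Where can I find training data?"
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank documents by cosine similarity between contextual embeddings.
scores = util.cos_sim(query_embedding, corpus_embeddings)[0]
best = int(scores.argmax())
print(f"Best match: {corpus[best]!r} (score={scores[best]:.3f})")
```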

Session 4 - Making Transformers Efficient in Production

Wednesday, September 28
8-10 AM (PT)

During the fourth session, we’ll discuss developing and deploying Transformer models in enterprise settings. Afterwards, learners will be able to:

Design an ML stack and workflow to tackle a given “enterprise”-level problem

Differentiate between the various ML deployment options, explaining their benefits and drawbacks

Experiment with optimization methods to squeeze performance from a Transformer model
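As one concrete example of that last objective, the sketch below applies post-training dynamic quantization in PyTorch, one common way to shrink a Transformer for CPU serving. The checkpoint and the size-measurement helper are illustrative assumptions:

```python
# A hedged optimization sketch: post-training dynamic quantization in
# PyTorch. The checkpoint and size helper are illustrative.
import os
import torch
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased-finetuned-sst-2-english"
)
model.eval()

# Swap linear layers for int8 equivalents: weights are stored in int8 and
# activations are quantized on the fly, shrinking the model and often
# speeding up CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

def size_mb(m):
    # Rough on-disk size: serialize the state dict and measure the file.
    torch.save(m.state_dict(), "tmp.pt")
    size = os.path.getsize("tmp.pt") / 1e6
    os.remove("tmp.pt")
    return size

print(f"original:  {size_mb(model):.1f} MB")
print(f"quantized: {size_mb(quantized):.1f} MB")
```

Quantization typically trades a small amount of accuracy for a large reduction in size and latency, which is why it pairs well with benchmarking the model before and after.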

Session 5 - Beyond NLP: Future Directions

Monday, October 3
8-10 AM (PT)

In this final session, the instructor will go beyond NLP to discuss implications of Transformers in other domains and the effects they have on our society. Learners will leave knowing how to:

Research some of the applications of Transformers in modalities outside of NLP

Argue for the importance of considering the ethical implications of large language models

Imagine the next five years of NLP and ML progress and impact, given what we know about the effects of scaling models to large sizes and where Transformers are today

Still have questions?

We’re here to help!

Do I have to attend all of the sessions live in real-time?

You don’t! We record every live session in the cohort and make each recording and the session slides available on our portal for you to access anytime.

Will I receive a certificate upon completion?

Each learner receives a certificate of completion at the end of the cohort (along with access to our Alumni portal!). Additionally, Sphere is listed as a school on LinkedIn, so you can display your certificate in the Education section of your profile.

Is there homework?

Throughout the cohort, there may be take-home questions that pertain to subsequent sessions. These are optional, but allow you to engage more with the instructor and other cohort members!

Can I get the course fee reimbursed by my company?

While we cannot guarantee that your company will cover the cost of the cohort, we are accredited by the Continuing Professional Development (CPD) Standards Office, meaning many of our learners are able to expense the course via their company or team’s L&D budget. We even provide an email template you can use to request approval.

I have more questions, how can I get in touch?

Please reach out to us via our Contact Form with any questions. We’re here to help!

Book a time to talk with the Sphere team