

Designed for ML Engineers & Researchers

who want to utilize Transformers to build reliable and scalable services

You will:
  • Champion Transformer models in your organization, finding opportunities to use them to solve business problems
  • Apply the abstractions and utilities that are available for simplifying the ML project lifecycle with Transformers
  • Connect to a vibrant network of researchers and engineers who value collaboration and open science
You should:
  • Have 3+ years of experience in either product OR engineering roles
  • Have experience creating and/or working with machine learning algorithms in industry settings

Learn live from world-class

Instructors

Lewis Tunstall is a machine learning engineer at Hugging Face, currently focused on optimizing Transformers for production workloads and researching novel techniques to train these models efficiently. He is also a co-author of the best-selling book “Natural Language Processing with Transformers” and has taught dozens of workshops on the topic to enterprises, universities, and the machine learning community at large.

Before joining Hugging Face, Lewis shipped machine learning powered applications for Swiss enterprises in the domains of natural language processing, time series, and topological data analysis. He has a PhD in Physics and has held research appointments at premier institutions in Australia, the United States, and Switzerland.


Nima Boscarino is a Developer Advocate at Hugging Face, where he helps community members make the most of the Hub, Transformers, Gradio, and the rest of the Hugging Face toolchain. Between creating educational content, facilitating workshops, and contributing to open source projects, he’s in his happy place!

In a prior life, Nima worked as a software engineer and educator, and he continues to draw on those experiences to promote socially conscious and responsible technologies.


Leandro von Werra is a machine learning engineer at Hugging Face, where he works on model evaluation and is the maintainer of 🤗 Evaluate. In addition, he works on machine learning for code and developed the open-source CodeParrot models.

He has several years of industry experience bringing NLP projects to production by working across the whole machine learning stack. And he is the creator of a popular Python library called TRL that combines Transformers with reinforcement learning.



Expense
the cost

90% of our learners expense the full cost of our courses to their employer. This includes leading startups and enterprises alike.

Check Expense Approval At Your Company

Exclusive Content

to advance your business

Get access to exclusive content through live sessions, meetups and our Student Portal (even after you finish the cohort). Ask questions and get personal feedback directly from your instructors and others taking the course.

Join a diverse and experienced

Community

This cohort gives you access to a rich community of like-minded professionals from some of the best businesses in the world. Even after the course ends, you will continue to learn and build with each other.

About Hugging Face's

Live Cohort

Transformer-based models have taken the Machine Learning world by storm! After immediately revolutionizing Natural Language Processing, Transformers continue to make an impact in both cutting-edge research and industry applications. The successes of Transformer models like BERT and GPT have kickstarted a rapidly growing ecosystem of models and techniques, and data-driven organizations are wary of being left behind. Hugging Face builds on the properties of Transformer models to make it easy for individuals to “fine-tune” their own models, lowering the barrier to research and allowing traditional software engineers to integrate machine learning functionality into their stacks. This course equips ML practitioners with the knowledge, skills, and tools they’ll need to use state-of-the-art models to solve their problems.

  • What key features have made Transformer models so successful in NLP?
  • How can I use Transformer models to solve problems in my domain?
  • What technical considerations do I need to address in order to develop and deploy my own Transformer models?
  • What does the future hold for Transformers?
  • How can I keep up and integrate with the latest Transformer models?

Session 1: Hello Transformers

In session one, the instructor will cover the importance and uses of Transformer models in the NLP ecosystem. At the end, learners will be able to:

  • Compare and contrast the various historical methods in NLP that led to Transformer-based models
  • Map the high-level intuition behind Transformer models to their impact and use-cases in NLP tasks
  • Construct a demo using the Hugging Face stack to showcase a model

Session 2: Training Transformers From Scratch

In session two, we’ll talk about using and fine-tuning Transformer models. By the end, learners will be able to:

  • Use an end-to-end inference pipeline
  • Distinguish the components that make up a Hugging Face Transformers inference pipeline
  • Invent an NLP problem and develop a solution by carrying out fine-tuning with a relevant model and dataset
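The core idea behind fine-tuning is to start from pretrained weights and continue training on a small, task-specific dataset rather than learning from scratch. A toy illustration of that idea, as a pure-Python sketch with a one-variable linear model (the course itself works with the Hugging Face Transformers API, not this simplified stand-in):

```python
# Fine-tuning in miniature: take "pretrained" parameters (w, b) and
# adapt them to a new dataset with a few steps of gradient descent.

def mse(w, b, data):
    """Mean squared error of a 1-D linear model y = w*x + b."""
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

def fine_tune(w, b, data, lr=0.01, steps=200):
    """Continue training from the given starting point (w, b)."""
    n = len(data)
    for _ in range(steps):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in data) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# "Pretrained" weights from a related task ...
w0, b0 = 2.0, 0.0
# ... adapted to a small new dataset where the true relation is y = 3x + 1.
task_data = [(0.0, 1.0), (1.0, 4.0), (2.0, 7.0), (3.0, 10.0)]

w1, b1 = fine_tune(w0, b0, task_data)
assert mse(w1, b1, task_data) < mse(w0, b0, task_data)
```

The same principle scales up: a pretrained Transformer already encodes general language knowledge, so only a relatively small amount of labeled data is needed to adapt it to a new task.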

Session 3: Transformer Academy

In session three, the instructor will explore how Transformers “understand” language data by using the attention mechanism to create embeddings, and the role that embeddings play in NLP. Learners will leave the session able to:

  • Recommend a particular NLP “task” to address a given problem
  • Demonstrate embeddings in action by building a Q&A search engine, leveraging an understanding of attention and contextualized representations
  • Experiment with loading, exploring, and processing datasets from the Hugging Face Hub
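The attention mechanism behind these contextualized representations can be sketched in a few lines of plain Python. This is a simplified illustration with hand-picked query, key, and value vectors; real Transformer layers use learned projections, multiple heads, and tensor libraries:

```python
# Scaled dot-product attention: score a query against each key,
# softmax the scores into weights, and return the weighted mix of values.
import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Return (output, weights): a similarity-weighted blend of `values`."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    output = [sum(w * v[i] for w, v in zip(weights, values))
              for i in range(dim)]
    return output, weights

# A query that matches the first key most strongly ...
query = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
values = [[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]]

output, weights = attention(query, keys, values)
# ... so the blended output is pulled toward the first value vector.
assert max(weights) == weights[0]
```

A Q&A search engine applies the same intuition one level up: embed the question and the candidate passages, then rank passages by similarity to the question.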

Session 4: Making Transformers Efficient in Production

During the fourth session, we’ll discuss developing and deploying Transformer models in enterprise settings. Afterwards, learners will be able to:

  • Design an ML stack and workflow to tackle a given “enterprise”-level problem
  • Differentiate between the various ML deployment options, explaining their benefits and drawbacks
  • Experiment with optimization methods to squeeze performance from a Transformer model
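One family of optimization techniques in this territory is post-training quantization: storing weights as 8-bit integers plus a scale factor, shrinking a float32 model roughly 4x at the cost of a small rounding error. A minimal pure-Python sketch of symmetric 8-bit quantization (production deployments would use tooling such as PyTorch quantization or ONNX Runtime rather than this hand-rolled version):

```python
# Post-training quantization in miniature: map float weights to int8
# with one shared scale factor, then reconstruct approximate floats.

def quantize(weights):
    """Map floats to int8 range [-127, 127] with a single scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # guard all-zero input
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [x * scale for x in q]

weights = [0.42, -1.27, 0.05, 0.88, -0.33]
q, scale = quantize(weights)
assert all(-127 <= x <= 127 for x in q)

recovered = dequantize(q, scale)
max_err = max(abs(w - r) for w, r in zip(weights, recovered))
# Rounding error is bounded by half a quantization step.
assert max_err <= scale / 2 + 1e-12
```

Whether that error is acceptable depends on the model and task, which is why optimization work pairs each technique with an accuracy check.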

Session 5: Beyond NLP: Future Directions

In this final session, the instructor will go beyond NLP to discuss implications of Transformers in other domains and the effects they have on our society. Learners will leave knowing how to:

  • Research some of the applications of Transformers in modalities outside of NLP
  • Argue for the importance of considering the ethical implications of large language models
  • Drawing on what we know about the effects of scaling models to large sizes and where Transformers are today, imagine the next five years of NLP and ML progress and impact

Still have questions?

We’re here to help!

Do I have to attend all of the sessions live in real-time?

You don’t! We record every live session in the cohort and make each recording and the session slides available on our portal for you to access anytime.

Will I receive a certificate upon completion?

Each learner receives a certificate of completion, which is sent to you upon completion of the cohort (along with access to our Alumni portal!). Additionally, Sphere is listed as a school on LinkedIn so you can display your certificate in the Education section of your profile.

Is there homework?

Throughout the cohort, there may be take-home questions that pertain to subsequent sessions. These are optional, but allow you to engage more with the instructor and other cohort members!

Can I get the course fee reimbursed by my company?

While we cannot guarantee that your company will cover the cost of the cohort, we are accredited by the Continuing Professional Development (CPD) Standards Office, meaning many of our learners are able to expense the course via their company or team’s L&D budget. We even provide an email template you can use to request approval.

I have more questions, how can I get in touch?

Please reach out to us via our Contact Form with any questions. We’re here to help!

Book a time to talk with the Sphere team

Join us for a next generation

Learning Experience