Enroll now to learn live with the Hugging Face team.
Format: 5 x 2 hr live workshops (+ recordings of each)
Sold Out: New dates coming soon
Price: $800 per seat (expense through L&D)
For engineers who build and maintain ML services and pipelines for their organization:
Champion Transformer models in your organization, finding opportunities to use them to solve business problems.
Apply the available abstractions and utilities to simplify the ML project lifecycle with Transformers.
Connect to a vibrant network of researchers and engineers who value collaboration and open science.
For researchers who are looking to bring the latest technologies and techniques into their lab or organization:
Apply Transformer models in their research, and mentor peers in doing the same.
Reflect on the current (and future) developments of Transformers, searching for connections to domains that have not yet been explored.
Transformer-based models have taken the Machine Learning world by storm! Having revolutionized Natural Language Processing almost overnight, Transformers continue to make an impact in both cutting-edge research and industry applications. The successes of Transformer models like BERT and GPT have kickstarted a rapidly growing ecosystem of models and techniques, and data-driven organizations are wary of being left behind. Hugging Face builds on the properties of Transformer models to make it easy for individuals to fine-tune their own models, lowering the barrier to research and allowing traditional software engineers to incorporate machine learning functionality into their stacks. This course equips ML practitioners with the knowledge, skills, and tools they’ll need to use state-of-the-art models to solve their problems.
What key features have made Transformer models so successful in NLP?
How can I use Transformer models to solve problems in my domain?
What technical considerations do I need to address in order to develop and deploy my own Transformer models?
What does the future hold for Transformers?
How can I keep up with the latest Transformer models and integrate them into my work?
Session 1 - Hello Transformers
In session one, the instructor will cover the importance and uses of Transformer models in the NLP ecosystem. By the end, learners will be able to:
Compare and contrast the various historical methods in NLP that led to Transformer-based models
Map the high-level intuition behind Transformer models to their impact and use-cases in NLP tasks
Construct a demo using the Hugging Face stack to showcase a model
Session 2 - Training Transformers from Scratch
In session two, we’ll talk about using and fine-tuning Transformer models. By the end, learners will be able to:
Use an end-to-end inference pipeline
Distinguish the components that make up a Hugging Face Transformers inference pipeline
Formulate an NLP problem and develop a solution by fine-tuning a relevant model on a suitable dataset
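To make the second objective concrete, the sketch below pulls apart the stages that a `transformers` inference pipeline chains together: tokenization, the model forward pass, and post-processing. The checkpoint name is an assumption (it is the library's default sentiment model); any sequence-classification checkpoint would do:

```python
# Sketch of the components inside a Transformers inference pipeline:
# tokenizer -> model -> post-processing. The checkpoint is an example.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# Tokenize, run the forward pass, then turn logits into a label.
inputs = tokenizer("I love this course!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)
label = model.config.id2label[probs.argmax().item()]
print(label, round(probs.max().item(), 3))
```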
Session 3 - Transformer Anatomy
In session three, the instructor will explore how Transformers “understand” language data by using the attention mechanism to create embeddings, and the role that embeddings play in NLP. Learners will leave the session able to:
Recommend a particular NLP “task” to address a given problem
Demonstrate embeddings in action by building a Q&A search engine, leveraging an understanding of attention and contextualized representations
Experiment with loading, exploring, and processing datasets from the Hugging Face Hub
Session 4 - Making Transformers Efficient in Production
During the fourth session, we’ll discuss developing and deploying Transformer models in enterprise settings. Afterwards, learners will be able to:
Design an ML stack and workflow to tackle a given “enterprise”-level problem
Differentiate between the various ML deployment options, explaining their benefits and drawbacks
Experiment with optimization methods to squeeze performance from a Transformer model
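One widely used optimization of the kind this session covers is dynamic quantization. A minimal sketch with PyTorch follows; the checkpoint name is an example, and the exact techniques taught may differ:

```python
# Sketch of one optimization method: dynamic quantization with PyTorch.
# The checkpoint is an example; the course may use different techniques.
import torch
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased-finetuned-sst-2-english"
)

# Replace Linear layers with dynamically quantized (int8) equivalents;
# this shrinks the model and often speeds up CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)
```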
Session 5 - Beyond NLP: Future Directions
In this final session, the instructor will go beyond NLP to discuss implications of Transformers in other domains and the effects they have on our society. Learners will leave knowing how to:
Research some of the applications of Transformers in modalities outside of NLP
Argue for the importance of considering the ethical implications of large language models
Given what we know about the effects of scaling models to large sizes and where Transformers are today, imagine the next five years of NLP and ML progress and impact
You don’t! We record every live session in the cohort and make each recording and the session slides available on our portal for you to access anytime.
Each learner receives a certificate of completion, which is sent to you at the end of the cohort (along with access to our Alumni portal!). Additionally, Sphere is listed as a school on LinkedIn, so you can display your certificate in the Education section of your profile.
Throughout the cohort, there may be take-home questions that pertain to subsequent sessions. These are optional but allow you to engage more with the instructor and other cohort members!
While we cannot guarantee that your company will cover the cost of the cohort, we are accredited by the Continuing Professional Development (CPD) Standards Office, meaning many of our learners are able to expense the course via their company or team’s L&D budget. We even provide an email template you can use to request approval.