BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

10th July 2024 | 15:00 - 16:30 CEST | online

Summary

Join us for a practical exploration of BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding by Devlin et al. (2018), a foundational work in language modeling and understanding.
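
To give a flavour of the hands-on part, below is a minimal sketch of BERT's masked language modeling objective using the Hugging Face transformers library and the bert-base-uncased checkpoint. This is an illustrative assumption on our part, not the session's official notebook; the actual materials will be shared on Kaggle.

    # Minimal masked-language-modeling demo with a pre-trained BERT
    # (illustrative sketch; requires: pip install transformers torch)
    from transformers import pipeline

    # BERT is pre-trained to predict tokens hidden behind a [MASK] placeholder,
    # using context from both the left and the right of the mask.
    fill_mask = pipeline("fill-mask", model="bert-base-uncased")

    for prediction in fill_mask("BERT is a [MASK] language model."):
        print(f"{prediction['token_str']:>15}  score={prediction['score']:.3f}")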

Requirements

  • Join the AI Maker Community Slack Workspace: Communication during the session happens in the #ai-maker-sessions channel of our Slack Workspace: https://join.aimaker.community

  • Create and verify a Kaggle account: This will be a hands-on exploration using Kaggle Notebooks, so you will need a Kaggle account. Make sure to verify it so that you can access the GPUs on the platform.

Event Details

This is an online event. We will post the session link in Slack shortly before it starts.
