
Kakao Brain: Accelerating large-scale natural language processing and AI development with Cloud TPU

Company Size
Large Corporate
Region
  • Asia
Country
  • Korea
Products
  • KoGPT
  • Google Cloud TPU
Tech Stack
  • Tensor Processing Unit (TPU)
  • Generative Pre-trained Transformer 3 (GPT-3)
Implementation Scale
  • Enterprise-wide Deployment
Impact Metrics
  • Productivity Improvements
  • Innovation Output
Technology
  • Analytics & Modeling - Machine Learning
  • Analytics & Modeling - Natural Language Processing (NLP)
Applicable Functions
  • Product Research & Development
Use Cases
  • Generative AI
  • Machine Translation
Services
  • Cloud Planning/Design/Implementation Services
About the Customer
Established in 2017, Kakao Brain is the artificial intelligence research and development subsidiary of Kakao Corp., a major South Korean tech company. It develops AI-based technologies in fields such as natural language processing and recently released KoGPT, a Korean natural language processing model, to further expand the use and value of AI. Kakao Brain's work centers on models that can process and understand complex data in the Korean language and perform tasks such as reading user intentions, writing letters, and even writing software code, thereby broadening the scope of AI applications available in Korean.
The Challenge
In November 2021, Kakao Brain, the artificial intelligence R&D subsidiary of South Korean tech giant Kakao Corp., unveiled KoGPT, a large-scale, deep-learning-based natural language processing model created by adapting Generative Pre-trained Transformer 3 (GPT-3), the most widely used natural language processing model, to the Korean language. For English, GPT-3 had already expanded well beyond simple text generation and translation: it can accurately read a user's intentions, write letters, and even write software code. No comparable model existed for Korean, because building a natural language generation model is labor intensive and requires rapid training on large-scale data. KoGPT closed that gap, with six billion model parameters trained on 200 billion tokens, producing an artificial intelligence model that can understand Korean.
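For context, the sketch below shows how a released KoGPT checkpoint could be loaded for Korean text generation with the Hugging Face transformers library. The repository id, revision tag, and special-token settings are assumptions based on Kakao Brain's public release and are not details taken from this case study.

```python
# Hypothetical sketch: loading the publicly released KoGPT checkpoint for inference.
# Repo id, revision, and special tokens below are assumed from the public release.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO = "kakaobrain/kogpt"                      # assumed Hugging Face repo id
REVISION = "KoGPT6B-ryan1.5b-float16"          # assumed revision tag

tokenizer = AutoTokenizer.from_pretrained(
    REPO, revision=REVISION,
    bos_token="[BOS]", eos_token="[EOS]",
    unk_token="[UNK]", pad_token="[PAD]", mask_token="[MASK]",
)
model = AutoModelForCausalLM.from_pretrained(
    REPO, revision=REVISION, torch_dtype=torch.float16,
).eval()

prompt = "카카오브레인은"  # "Kakao Brain is ..."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(
        **inputs, max_new_tokens=64, do_sample=True, top_p=0.9,
        pad_token_id=tokenizer.pad_token_id,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```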
The Solution
According to Woonhyuk Baek, Large-Scale AI Research Scientist at Kakao Brain, Google Cloud TPU, a dedicated machine learning processor optimized for training on large-scale data, plays an important part in accelerating the training of KoGPT and its massive workloads. Baek explains that understanding the respective characteristics of the GPU (Graphics Processing Unit) and the TPU (Tensor Processing Unit) is the most important starting point for using them properly. Although the TPU has strong AI data processing capabilities, simply replacing every AI system with TPUs will not immediately yield the desired results; the two processors clearly complement each other. A GPU lets a project start quickly and adapts easily to general-purpose environments, but it is not easy to scale. A TPU, by contrast, is easy to manage because resources are provisioned in units of pods, and the communication speed between nodes is fast, which is integral for large-scale data processing. Unlike GPUs, which run both on-premises and in the cloud, Cloud TPU was built to accelerate machine learning workloads within the Google Cloud ecosystem. Baek adds that on-demand TPU devices and pod slices simplified workload management, and that the fast networking between TPU nodes made data processing seamless.
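To make the pod-slice workflow Baek describes more concrete, here is a minimal, hypothetical sketch of data-parallel gradient averaging across the TPU cores of a Cloud TPU VM using JAX. It illustrates the programming model only, under the assumption of a TPU VM with the JAX TPU runtime installed; it is not Kakao Brain's actual training code.

```python
# Minimal sketch of data-parallel training on a Cloud TPU slice with JAX.
# Assumes a Cloud TPU VM with the JAX TPU runtime; this toy example illustrates
# the programming model, not Kakao Brain's KoGPT training pipeline.
import jax
import jax.numpy as jnp

# On a TPU VM, jax.devices() enumerates the TPU cores of the attached slice.
num_cores = len(jax.devices())
print(f"Detected {num_cores} TPU cores")

def loss_fn(w, x, y):
    # A toy least-squares loss standing in for a language-model loss.
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

def grad_step(w, x, y):
    g = jax.grad(loss_fn)(w, x, y)
    # pmean averages gradients across all cores over the TPU interconnect,
    # the fast node-to-node communication highlighted in the case study.
    return jax.lax.pmean(g, axis_name="cores")

p_grad_step = jax.pmap(grad_step, axis_name="cores")

# Replicate parameters and shard a toy batch: one leading-axis entry per core.
w = jnp.stack([jnp.zeros((4, 1))] * num_cores)
x = jnp.stack([jnp.ones((8, 4))] * num_cores)
y = jnp.stack([jnp.ones((8, 1))] * num_cores)

grads = p_grad_step(w, x, y)
print(grads.shape)  # (num_cores, 4, 1): identical averaged gradients on each core
```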
Operational Impact
  • Massively reduces the workload of processing large-scale data.
  • Shortens task completion time from seven days to one day.
  • Enables seamless large-scale system scalability.
  • Provides flexibility and reliability for AI research and development.
  • Allows Kakao Brain to draw a clearer product development roadmap for future goals.
Quantitative Benefits
  • Shortens task completion time from seven days to one day.
  • Processes six billion model parameters and 200 billion tokens.
