Commit 922cc2f: add

Signed-off-by: Chris Abraham <[email protected]>
1 parent: 3f99cb7

14 files changed: +91 additions, −7 deletions


_community_stories/15.md

Lines changed: 0 additions & 7 deletions
This file was deleted.

_community_stories/43.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'Using deep learning and PyTorch to power next gen aircraft at Caltech'
+ext_url: https://www.youtube.com/watch?v=se206WBk2dM
+date: Nov 14, 2019
+tags: ["Research", "Aerospace"]
+---
+Learn how Caltech’s Center for Autonomous Systems and Technologies (CAST) uses PyTorch to build deep learning systems that can understand the aerodynamics of how aircraft interact with the ground to enable much smoother and safer landings.

_community_stories/44.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'Deepset achieves a 3.9x speedup and 12.8x cost reduction for training NLP models by working with AWS and NVIDIA'
+ext_url: https://aws.amazon.com/blogs/machine-learning/deepset-achieves-a-3-9x-speedup-and-12-8x-cost-reduction-for-training-nlp-models-by-working-with-aws-and-nvidia/
+date: Jan 27, 2021
+tags: ["Research", "NLP"]
+---
+At deepset, we’re building the next-level search engine for business documents. Our core product, Haystack, is an open-source framework that enables developers to utilize the latest NLP models for semantic search and question answering at scale. Our software as a service (SaaS) platform, Haystack Hub, is used by developers from various industries, including finance, legal, and automotive, to find answers in all kinds of text documents. You can use these answers to improve the search experience, cover the long tail of chatbot queries, extract structured data from documents, or automate invoicing processes.

_community_stories/45.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'PyTorch at Dolby Labs'
+ext_url: https://www.youtube.com/watch?v=K5hD0et_wUc&list=PL_lsbAsL_o2BY-RrqVDKDcywKnuUTp-f3&index=20
+date: Nov 6, 2019
+tags: ["Research", "NLP"]
+---
+Hear how Dolby Labs is using PyTorch to develop deep learning for audio, and learn about the challenges that audio AI presents and the breakthroughs and applications they’ve built at Dolby to push the field forward.

_community_stories/46.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'Using a Grapheme to Phoneme Model in Cisco’s Webex Assistant'
+ext_url: https://blogs.cisco.com/developer/graphemephoneme01
+date: September 7, 2021
+tags: ["Research", "NLP"]
+---
+Grapheme to Phoneme (G2P) is a function that generates pronunciations (phonemes) for words based on their written form (graphemes). It has an important role in automatic speech recognition systems, natural language processing, and text-to-speech engines. In Cisco’s Webex Assistant, we use G2P modelling to assist in resolving person names from voice. See here for further details of various techniques we use to build robust voice assistants.
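As a toy illustration of the G2P idea described in this story (not Cisco’s actual model, which is learned rather than table-driven), a minimal sketch could map graphemes to ARPAbet-style phonemes via a dictionary lookup with a fallback; all entries below are hypothetical examples:

```python
# Minimal dictionary-based grapheme-to-phoneme (G2P) sketch.
# Real systems use trained sequence models to handle unseen words;
# this lookup table only illustrates the input/output contract.
PRON_DICT = {
    "webex": ["W", "EH", "B", "EH", "K", "S"],
    "cisco": ["S", "IH", "S", "K", "OW"],
}

def g2p(word, fallback="<unk>"):
    """Return the phoneme sequence for a word, or a fallback marker."""
    return PRON_DICT.get(word.lower(), [fallback])

print(g2p("Cisco"))  # ['S', 'IH', 'S', 'K', 'OW']
print(g2p("Jurassic"))  # ['<unk>'] — an out-of-vocabulary word
```

A learned model replaces the dictionary with a sequence-to-sequence mapping, which is what makes resolving unseen person names possible.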

_community_stories/47.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'AI21 Labs Trains 178-Billion-Parameter Language Model Using Amazon EC2 P4d Instances, PyTorch'
+ext_url: https://aws.amazon.com/solutions/case-studies/AI21-case-study-p4d/
+date: June 7, 2021
+tags: ["Research", "NLP"]
+---
+AI21 Labs uses machine learning to develop language models focused on understanding meaning, and in 2021 it set a goal to train the recently released Jurassic-1 Jumbo, an autoregressive language model with 178 billion parameters. Developers who register for beta testing will get access to Jurassic-1 Jumbo and can immediately start to customize the model for their use case. The software startup wanted to train the model efficiently, so it looked to Amazon Web Services (AWS) and built a solution using Amazon Elastic Compute Cloud (Amazon EC2), a web service that provides secure, resizable compute capacity in the cloud. Choosing Amazon EC2 gave the company control over the training process, including node allocation.

_community_stories/48.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'The Why and How of Scaling Large Language Models'
+ext_url: https://www.youtube.com/watch?v=qscouq3lo0s
+date: Jan 4, 2022
+tags: ["Research", "NLP"]
+---
+Anthropic is an AI safety and research company that’s working to build reliable, interpretable, and steerable AI systems. Over the past decade, the amount of compute used for the largest training runs has increased at an exponential pace. We've also seen in many domains that larger models are able to attain better performance following precise scaling laws. The compute needed to train these models can only be attained using many coordinated machines that are communicating data between them. In this talk, Nicholas Joseph (Technical Staff, Anthropic) goes through why and how they can scale up training runs to use these machines efficiently.
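The scaling-law behavior mentioned in this talk can be sketched as a simple power law relating training compute to loss; the constants below are invented for illustration and are not taken from the talk or any published fit:

```python
# Toy power-law scaling curve: predicted loss falls as a power of compute.
# a and alpha are made-up constants; real scaling laws are fit empirically.
def scaling_law_loss(compute, a=10.0, alpha=0.05):
    """Predicted loss L(C) = a * C**(-alpha) for training compute C (FLOPs)."""
    return a * compute ** (-alpha)

# A power law means doubling compute always multiplies the predicted loss
# by the same constant factor, regardless of the starting point:
ratio = scaling_law_loss(2e20) / scaling_law_loss(1e20)
print(round(ratio, 4))  # 2**(-0.05) ≈ 0.9659
```

This constant-ratio property is what makes the curves "precise" enough to extrapolate when planning larger training runs.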

_community_stories/49.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'University of Pécs enables text and speech processing in Hungarian, builds the BERT-large model with just 1,000 euro with Azure'
+ext_url: https://www.microsoft.com/en/customers/story/1402696956382669362-university-of-pecs-higher-education-azure-en-hungary
+date: August 10, 2021
+tags: ["Research", "NLP"]
+---
+Everyone prefers to use their mother tongue when communicating with chat agents and other automated services. However, for languages like Hungarian—spoken by only 15 million people—the market is often viewed as too small for large companies to create software, tools, or applications that can process Hungarian text as input. Recognizing this need, the Applied Data Science and Artificial Intelligence team from the University of Pécs decided to step up. Using Microsoft AI Solutions and ONNX Runtime, it built and trained its own BERT-large model in native Hungarian in under 200 hours and at a total build cost of 1,000 euros.

_community_stories/50.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'Mapillary Research: Seamless Scene Segmentation and In-Place Activated BatchNorm'
+ext_url: /blog/mapillary-research/
+date: July 23, 2019
+tags: ["Research"]
+---
+With roads in developed countries like the US changing up to 15% annually, Mapillary addresses a growing demand for keeping maps updated by combining images from any camera into a 3D visualization of the world. Mapillary’s independent and collaborative approach enables anyone to collect, share, and use street-level images for improving maps, developing cities, and advancing the automotive industry.

_community_stories/51.md

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+---
+title: 'How 3DFY.ai Built a Multi-Cloud, Distributed Training Platform Over Spot Instances with TorchElastic and Kubernetes'
+ext_url: https://medium.com/pytorch/how-3dfy-ai-built-a-multi-cloud-distributed-training-platform-over-spot-instances-with-44be40936361
+date: Jun 17, 2021
+tags: ["Research"]
+---
+Deep learning development is increasingly about minimizing the time from idea to trained model. To shorten this lead time, researchers need access to a training environment that supports running multiple experiments concurrently, each utilizing several GPUs.

0 commit comments
