
Commit 726db50

Merge branch 'site' into fix-previous-versions
2 parents e24a354 + f51c0fd commit 726db50

File tree

204 files changed: +6383 / -1237 lines changed


.github/workflows/build.yml

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ on:
 
 jobs:
   tests:
-    uses: pytorch/test-infra/.github/workflows/linux_job.yml@main
+    uses: pytorch/test-infra/.github/workflows/linux_job_v2.yml@main
     secrets: inherit
     with:
       runner: linux.12xlarge
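
For context, a caller workflow that consumes this reusable workflow might look roughly like the sketch below. Only the uses, secrets, and with: runner lines mirror the committed change; the workflow name and triggers are illustrative assumptions, and linux_job_v2.yml may accept additional inputs that are not shown here.

# Illustrative caller workflow (sketch; name and triggers are assumptions)
name: Build
on:
  push:
    branches: [site]
  pull_request:

jobs:
  tests:
    # Delegate the whole job to the shared reusable workflow in pytorch/test-infra
    uses: pytorch/test-infra/.github/workflows/linux_job_v2.yml@main
    # Forward the caller's secrets to the reusable workflow
    secrets: inherit
    # Inputs declared by the reusable workflow; the runner value comes from the diff above
    with:
      runner: linux.12xlarge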

.github/workflows/update-quick-start-module.yml

Lines changed: 1 addition & 1 deletion
@@ -63,7 +63,7 @@ jobs:
   update-quick-start:
     needs: [linux-nightly-matrix, windows-nightly-matrix, macos-arm64-nightly-matrix,
       linux-release-matrix, windows-release-matrix, macos-arm64-release-matrix]
-    runs-on: "ubuntu-20.04"
+    runs-on: "ubuntu-latest"
     environment: pytorchbot-env
     steps:
       - name: Checkout pytorch.github.io

CNAME

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-pytorch.org
+docs.pytorch.org

_community_blog/optimize-llms.md

Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
+---
+title: "Optimize LLMs for Efficiency & Sustainability"
+ext_url: /blog/optimize-llms/
+date: Feb 19, 2025
+author: "Zach Lasiuk, Arm"
+---
+
+The rapid growth of large language model (LLM) applications is linked to rapid growth in energy demand. According to the International Energy Agency (IEA), data center electricity consumption is projected to roughly double by 2026 primarily driven by AI. This is due to the energy-intensive training requirements for massive LLMs – however, the increase in AI Inferencing workloads also plays a role. For example, compared with traditional search queries, a single AI inference can consume about [10x more energy](https://www.weforum.org/stories/2024/07/generative-ai-energy-emissions/).
Lines changed: 9 additions & 0 deletions

@@ -0,0 +1,9 @@
+---
+title: "Powering AI with PyTorch, Fedora, and Open Source Communities"
+author: Sudhir Dharanendraiah
+ext_url: /blog/pt-fedora-os-communities/
+date: Mar 7, 2025
+---
+
+At [DevConf.IN 2025](https://www.devconf.info/in/) in Pune, I had the opportunity to host a **[PyTorch Meetup](https://pretalx.devconf.info/devconf-in-2025/talk/W3YURM/)** on February 28th. The session, titled "**Powering AI with PyTorch, Fedora, and Open Source Communities**" was aimed at introducing PyTorch to students and professionals, explaining why **PyTorch+Fedora** form an ideal AI development platform. The other key aspect I covered was collaboration between open source communities.
+

_community_blog/pytorch-at-gtc.md

Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
+---
+title: "PyTorch at GTC 2025"
+author: "Team PyTorch at NVIDIA"
+ext_url: /blog/pytorch-at-gtc/
+date: Mar 16, 2025
+---
+
+[GTC](https://www.nvidia.com/gtc/) is coming back to San Jose on March 17–21, 2025. Join PyTorch Foundation members Arm, AWS, Google Cloud, IBM, Lightning AI, Meta, Microsoft Azure, Snowflake, and thousands of developers as we celebrate PyTorch. Together learn how AI & accelerated computing are helping humanity solve our most complex challenges.
Lines changed: 8 additions & 0 deletions

@@ -0,0 +1,8 @@
+---
+title: "SGLang Joins PyTorch Ecosystem: Efficient LLM Serving Engine"
+author: "SGLang Team"
+ext_url: /blog/sglang-joins-pytorch/
+date: Mar 19, 2025
+---
+
+We’re thrilled to announce that the SGLang project has been integrated into the PyTorch ecosystem! This integration ensures that SGLang aligns with PyTorch’s standards and practices, providing developers with a reliable and community-supported framework for fast and flexible serving of LLMs.

_community_stories/57.md

Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
+---
+title: 'How IBM Research Uses PyTorch and TerraTorch to Make Geospatial Computer Vision Accessible for Everyone'
+ext_url: /blog/how-ibm-uses-pt-terratorch/
+date: May 1, 2025
+tags: ["Computer Vision"]
+---
+
+Geospatial computer vision is essential for understanding our planet — from monitoring deforestation to tracking urban development and analyzing the impacts of climate change. However, the coding and deep learning skills for applying AI models to satellite imagery and earth observation data has traditionally been a major barrier for many practitioners.
Lines changed: 23 additions & 0 deletions

@@ -0,0 +1,23 @@
+---
+category: event
+title: "Towards Autonomous Language Model Systems"
+date: May 21, 2025
+poster: assets/images/pt-day-cfp.png
+---
+
+<a href="/autonomous-language-model-systems">
+<img style="width:100%" src="/assets/images/autonomous-language-model-systems.png" alt="Towards Autonomous Language Model Systems">
+</a>
+
+**Date**: May 21, 2025, 11AM PT / 2PM ET
+**Location**: Online
+
+Language models (LMs) are increasingly used to assist users in day-to-day tasks such as programming (Github Copilot) or search (Google's AI Overviews). But can we build language model systems that are able to autonomously complete entire tasks end-to-end?
+
+In this talk, Ofir Press will discuss efforts to build autonomous LM systems, focusing on the software engineering domain. Ofir will present SWE-bench, a novel method for measuring AI systems on their abilities to fix real issues in popular software libraries. Ofir will then discuss SWE-agent, a system for solving SWE-bench tasks.
+
+SWE-bench and SWE-agent are used by many leading AI organizations in academia and industry, including OpenAI, Anthropic, Meta, and Google, and SWE-bench has been downloaded over 2 million times. These projects show that academics on tight budgets can have a substantial impact in steering the research community toward building autonomous systems that can complete challenging tasks.
+
+Ofir is a postdoc at Princeton University, where they mainly work with Karthik Narasimhan's lab. Ofir previously completed their PhD at the University of Washington in Seattle, where Ofir was advised by Noah Smith. During their PhD, Ofir spent two years at Facebook AI Research Labs on Luke Zettlemoyer's team.
+
+[Register Now](/autonomous-language-model-systems)

_events/ce1.md

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
+---
+category: event
+title: "COLING 2025"
+date: Jan 19, 2025
+---
+<span class="community-event">Community Event</span>
+
+**Date**: Jan 19 - 25, 2025
+
+COLING, the International Conference on Computational Linguistics, is one of the premier conferences for the natural language processing and computational linguistics.
+
+First established in 1965, the biennial COLING conference is held in diverse parts of the globe and attracts participants from both top-ranked research centers and emerging countries. Today, the most important developments in our field are taking place not only in universities and academic research institutes but also in industrial research departments including tech-startups. COLING provides opportunities for all these communities to showcase their exciting discovery.
+
+[Learn more about this event](https://coling2025.org/)
