## 🚀 Feature

Please enable `fp8e4m3` in torch_xla. Support for this type is in flight in openxla: https://github.com/openxla/xla/pull/16585/files

Today, PyTorch does not yet expose a plain `fp8e4m3` dtype; only the `fn`/`fnuz` variants are supported. @amithrm would like this feature as an alternative to `fp8e4m3fn`.

cc @amithrm @JackCaoG
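
For reference, a minimal sketch of what usage could look like once the dtype lands. The name `torch.float8_e4m3` is an assumption (no such attribute exists in PyTorch today); the suffixed variants shown first are the ones currently available:

```python
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()

# fp8 variants PyTorch already exposes today.
x_fn = torch.randn(4, 4).to(torch.float8_e4m3fn)      # finite-only e4m3
x_fnuz = torch.randn(4, 4).to(torch.float8_e4m3fnuz)  # no-inf, no negative-zero e4m3

# The plain e4m3 type requested here does not exist yet. Once PyTorch and
# torch_xla support it (the attribute name `torch.float8_e4m3` is assumed),
# a tensor could be cast and moved to the XLA device like any other dtype.
if hasattr(torch, "float8_e4m3"):
    x = torch.randn(4, 4).to(torch.float8_e4m3).to(device)
    print(x.dtype)
else:
    print("plain fp8e4m3 is not available in this PyTorch build")
```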