Fix broken links (#20831)
### Description
Fix broken links referenced in issue #20790 

The Llama-2 blog link was updated to point to the correct location, and the ResNet sample was removed entirely because it no longer exists anywhere in the AML docs.

### Motivation and Context
<!-- - Why is this change required? What problem does it solve?
- If it fixes an open issue, please link to the issue here. -->
sophies927 authored May 28, 2024
1 parent e3e4f21 commit aa6097a
Showing 2 changed files with 1 addition and 2 deletions.
docs/ecosystem/index.md (1 change: 0 additions & 1 deletion)

@@ -18,7 +18,6 @@ ONNX Runtime functions as part of an ecosystem of tools and platforms to deliver

## Azure Machine Learning Services
* [Azure Container Instance: BERT](https://github.com/microsoft/onnxruntime/tree/main/onnxruntime/python/tools/transformers/notebooks/Inference_Bert_with_OnnxRuntime_on_AzureML.ipynb){:target="_blank"}
- * [Azure Container Instance: Image classification (Resnet)](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/deployment/onnx/onnx-modelzoo-aml-deploy-resnet50.ipynb){:target="_blank"}
* [Azure Kubernetes Services: FER+](https://github.com/microsoft/onnxruntime/blob/main/docs/python/notebooks/onnx-inference-byoc-gpu-cpu-aks.ipynb){:target="_blank"}
* [Azure IoT Edge (Intel UP2 device with OpenVINO)](https://github.com/Azure-Samples/onnxruntime-iot-edge/blob/master/AzureML-OpenVINO/README.md){:target="_blank"}
* [Automated Machine Learning](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification-bank-marketing-all-features/auto-ml-classification-bank-marketing-all-features.ipynb){:target="_blank"}
src/routes/blogs/accelerating-llama-2/+page.svelte (2 changes: 1 addition & 1 deletion)

@@ -293,7 +293,7 @@

<p class="mb-4">
Here is a <a
- href="https://github.com/microsoft/onnxruntime-inference-examples/blob/main/python/models/llama2/LLaMA-2%20E2E%20Notebook.ipynb"
+ href="https://github.com/microsoft/onnxruntime-inference-examples/blob/main/python/models/llama/LLaMA-2%20E2E%20Notebook.ipynb"
class="text-blue-500">sample notebook</a
> that shows you an end-to-end example of how you can use the above ONNX Runtime optimizations
in your application.
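Not part of the commit itself, but as a quick sanity check for link fixes like this one: below is a minimal Python sketch that verifies the updated notebook URL from the diff above still resolves. The `link_is_live` helper and the script structure are illustrative assumptions; only the URL comes from the change.

```python
# Hypothetical helper (not part of this commit): check that the updated
# notebook link from the diff above answers with a successful HTTP status.
import urllib.error
import urllib.request

NOTEBOOK_URL = (
    "https://github.com/microsoft/onnxruntime-inference-examples/"
    "blob/main/python/models/llama/LLaMA-2%20E2E%20Notebook.ipynb"
)


def link_is_live(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL responds with an HTTP 2xx/3xx status."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return 200 <= response.status < 400
    except urllib.error.URLError:
        # Covers HTTP errors (e.g. 404) as well as network failures.
        return False


if __name__ == "__main__":
    status = "OK" if link_is_live(NOTEBOOK_URL) else "BROKEN"
    print(f"{NOTEBOOK_URL} -> {status}")
```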
