#gpu-acceleration

from InfoQ
1 month ago

GPULlama3.java Brings GPU-Accelerated LLM Inference to Pure Java

The TornadoVM programming guide shows how developers can use hardware-agnostic APIs so that the same Java source code runs unchanged across different hardware accelerators.
Java
from Hackernoon
2 months ago

Supercharge ML: Your Guide to GPU-Accelerated cuML and XGBoost | HackerNoon

GPU acceleration can greatly enhance traditional machine-learning workflows.
The guide focuses on cuML, XGBoost, and dimensionality-reduction techniques for efficient data processing.
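The drop-in pattern the article describes can be sketched as follows. This is a minimal illustration, assuming cuML and a CUDA GPU are available; the GPU import is commented out so the sketch also runs on a plain scikit-learn install, which is shown as the CPU baseline.

```python
import numpy as np
from sklearn.cluster import KMeans  # CPU baseline

# Synthetic data for clustering.
X = np.random.RandomState(0).rand(1000, 16).astype(np.float32)

# CPU execution with scikit-learn:
labels_cpu = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

# GPU version: cuML mirrors the scikit-learn estimator API, so with cuML
# installed the import is essentially the only line that changes
# (assumption: cuML + CUDA GPU present; commented out here):
# from cuml.cluster import KMeans
# labels_gpu = KMeans(n_clusters=8, random_state=0).fit_predict(X)

print(labels_cpu.shape)  # one cluster id per input row
```

The appeal of this pattern is that existing scikit-learn pipelines need almost no restructuring to move onto the GPU.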
from Hackernoon
2 months ago

Achieve 100x Speedups in Graph Analytics Using Nx-cugraph | HackerNoon

Leveraging GPU acceleration can significantly improve performance in graph analytics.
nx-cugraph speeds up existing NetworkX workflows on larger datasets without requiring code rewrites.
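NetworkX 3.x can dispatch algorithm calls to optional backends, which is how nx-cugraph plugs in. A minimal sketch, assuming NetworkX is installed; the GPU call is commented out because it additionally needs nx-cugraph and a CUDA GPU:

```python
import networkx as nx

# Small built-in example graph (34 nodes).
G = nx.karate_club_graph()

# Standard CPU execution:
pr = nx.pagerank(G)

# With nx-cugraph installed, the same call can be routed to the GPU
# through NetworkX's backend dispatch (assumption: backend name
# "cugraph" as documented by the nx-cugraph project):
# pr = nx.pagerank(G, backend="cugraph")

print(len(pr), round(sum(pr.values()), 4))  # PageRank scores sum to ~1
```

Because dispatch happens per call, the CPU and GPU paths share one code base; the backend keyword (or an environment variable) is the only switch.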
from InfoQ
2 months ago

Google Enhances LiteRT for Faster On-Device Inference

LiteRT, previously TensorFlow Lite, enhances on-device ML inference by simplifying GPU and NPU integration, achieving up to 25x speed improvements and lower power usage.
Artificial intelligence
Node.js
from GitHub
4 months ago

GitHub - gezilinll/pica-gpu: a high-quality, GPU-accelerated image resizer

pica-gpu performs image resizing on the GPU, significantly improving performance and reducing CPU usage.
The library features multiple filtering algorithms, ensuring high-quality image processing.
Data science
from Hackernoon
5 months ago

How Panopticus Uses AI to Detect Objects in 3D | HackerNoon

Panopticus improves omnidirectional 3D object detection using multi-branch models and optimized GPU-based execution.
from towardsdatascience.com
5 months ago
Artificial intelligence

Breaking the Bottleneck: GPU-Optimised Video Processing for Deep Learning

Deep learning applications can improve performance by minimizing CPU-GPU transfer bottlenecks through GPU-accelerated video decoding.