It takes some digging to find out how things work and how to optimise them for our Pis.
The OnnxStream version of SD XL is much faster than the SD that Easy Diffusion installs.
Under that it uses XNNPACK, which when compiled for the Pi5 produces a 4.7MB libXNNPACK.a file.
XNNPACK can accelerate frameworks; the trick is to get those frameworks to use the optimised lib.a file.
From the XNNPACK README: "XNNPACK is a highly optimized solution for neural network inference on ARM, x86, WebAssembly, and RISC-V platforms. XNNPACK is not intended for direct use by deep learning practitioners and researchers; instead it provides low-level performance primitives for accelerating high-level machine learning frameworks, such as TensorFlow Lite, TensorFlow.js, PyTorch, ONNX Runtime, and MediaPipe."
Above those frameworks sit the LLMs etc., I think.
Not sure how many AI/LLM PyTorch installs I have now, but I am down to only 10% free on my 450GB SSD.
Time to erase and start again.
Why Free Pascal?
There is an NN lib for Lazarus called CAI, but it is x86-optimised.
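For anyone curious, building a net in CAI looks roughly like this. It is only a sketch, with the unit and class names written from memory of the neural-api examples, so expect to adjust them before it compiles:

Code:
program CaiSketch;

{$mode objfpc}

uses
  neuralnetwork, neuralvolume;          // CAI neural-api units (names from memory)

var
  NN: TNNet;
begin
  NN := TNNet.Create();
  NN.AddLayer([
    TNNetInput.Create(32, 32, 3),       // 32x32 RGB input
    TNNetFullConnectReLU.Create(64),    // small hidden layer
    TNNetFullConnectLinear.Create(10),  // 10 output classes
    TNNetSoftMax.Create()
  ]);
  NN.DebugStructure();                  // print the layer list to the console
  NN.Free;
end.

As far as I can tell its speed comes from hand-tuned x86/AVX code paths, which is why it does not buy much on an ARM Pi.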
There is also the TONNXRuntime library for Lazarus/Delphi; I got a compile error with it that I still need to figure out.
Ultibo can use C-compiled lib.a files.
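So a first sanity check is whether Free Pascal can link the Pi5-built libXNNPACK.a and call its init routine. A minimal sketch, assuming the static library and whatever dependencies the XNNPACK build produced (pthreadpool and cpuinfo, possibly libpthread and libm too) are on the linker path; xnn_initialize() is declared in xnnpack.h and xnn_status_success should be 0:

Code:
program XnnLinkTest;

{$mode objfpc}
{$LINKLIB XNNPACK}      // the 4.7MB static lib built on the Pi5
{$LINKLIB pthreadpool}  // XNNPACK build dependencies - adjust to what your build produced
{$LINKLIB cpuinfo}

// From xnnpack.h: enum xnn_status xnn_initialize(const struct xnn_allocator* allocator);
function xnn_initialize(allocator: Pointer): Integer; cdecl; external;

begin
  if xnn_initialize(nil) = 0 then       // 0 = xnn_status_success
    WriteLn('XNNPACK initialised OK on this Pi')
  else
    WriteLn('XNNPACK init failed');
end.

If that runs on Raspberry Pi OS first, the same declarations should carry over to an Ultibo kernel image, since Ultibo links static .a files in much the same way.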
I should be able to make a bare-metal Ultibo AI OS without needing Linux, maybe.
An embedded AI OS should run better and faster than a big, bloated Linux/Windows OS and be far easier to maintain.
Small and fast enough to run on my old Pis?
The big AI guys have their CUDA, H100 and Groq server farms; I have a Pi5 and a much smaller budget.
I think I am having more fun.
This is basically a learning exercise for me.
To answer my question, "Is local AI of any use for a home hobbyist?"
The answer so far is yes.
I have a few more home use cases, like a 40+ year-old dream of a robot lawnmower.
Sure, I could do it with RTK-GPS, but I have lots of trees and those RTK units are not that cheap.
Pi cameras are cheaper, and the Pi5 has two camera inputs, so stereo vision is possible.
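The stereo maths is simple enough to sketch: depth is roughly focal length times baseline divided by disparity, so two cheap cameras a known distance apart give a range estimate for obstacles. The numbers below are made-up placeholders, not measured values:

Code:
program StereoDepthSketch;

{$mode objfpc}

const
  FocalLengthPx = 1000.0;  // focal length in pixels - placeholder, needs calibration
  BaselineM     = 0.10;    // spacing between the two cameras in metres - placeholder

// Pinhole stereo relation: Z = f * B / d
function DisparityToDepth(DisparityPx: Double): Double;
begin
  Result := (FocalLengthPx * BaselineM) / DisparityPx;
end;

begin
  // e.g. a 20 pixel disparity puts an obstacle about 5 metres away
  WriteLn('Depth at 20 px disparity: ', DisparityToDepth(20.0):0:2, ' m');
end.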
Will an auto-lawnmower OS fit in a 2GB Pi5?
If Apple's ReALM is real, then it might fit in a Zero 2.
Things can only get better as more SBCs come out with NPUs.
Statistics: Posted by Gavinmc42 — Sat Apr 06, 2024 5:15 am