
Advanced users • Re: PI5 and easy AI/CV/LLM

Just installed another LLM.
It gave me a long answer.
ollama run dolphin-phi
>>> what is 2 + 2 ?
As an artificial intelligence, I don't have personal experiences or
emotions like humans do. However, to solve the question "2 + 2", it's a
simple arithmetic operation that adds two numbers together. In this case,
you're adding 2 and 2. The sum of these two numbers is 4. So, if you were
asking me how many 'dolphins' there are in total in two separate groups of
two dolphins each, the answer would be 4 dolphins.
At least it is polite.
Why did you give me a long answer?
My apologies for any confusion or frustration this may have caused. As an
AI, I strive to provide accurate and comprehensive responses to your
questions. However, I understand that sometimes the information provided
can be too lengthy or detailed for immediate understanding. If there are
specific aspects of my response you would like me to address in a more
concise manner, please let me know, and I will be happy to do so.
Learning how to prompt these LLMs is necessary, as we have limited CPU power and big LLMs are slow and like to show off.
what is the short answer for 2 + 2
The short answer for 2 + 2 is 4.
Much better :D
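If you get tired of adding "short answer" to every prompt, Ollama lets you bake a system prompt into a model with a Modelfile. A sketch, assuming dolphin-phi is already pulled; the name brief-phi is made up here:

```
# Modelfile — build a terse variant of dolphin-phi
FROM dolphin-phi
SYSTEM Answer as briefly as possible, in one sentence where you can.
PARAMETER num_predict 64
```

Then `ollama create brief-phi -f Modelfile` and `ollama run brief-phi` should give you the short answers by default.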

Big LLMs just became more useful?

I am not sure what personal experience an AI needs to solve simple math.
"sometimes the information provided can be too lengthy or detailed for immediate understanding"
I think I was just called stupid by an LLM :lol:


This is much more fun than Eliza; I only had to wait 50 years for a useful version.
Now which LLM is good for chat therapy?
I just got dissed by a $140 computer and a bit of code.

Wonder what the long answer is to 2 + 2? :o

Not going to ask until I have more Pi5 in case it breaks.

Now that is interesting, open another terminal window and run another LLM.
They just sit at their prompts, waiting for a question.
They don't take up much memory till they start answering.

Had no luck installing WebGUIs for the LLMs; perhaps there is a simpler way?
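One simpler route than a full WebGUI: Ollama serves a plain HTTP API on localhost:11434, so a few lines of Python can stand in for a front end. A minimal sketch, assuming the default port and that dolphin-phi is pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint; adjust if you changed the port.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()

def ask(model: str, prompt: str) -> str:
    """Send a prompt to a local Ollama model and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage, with ollama running:
#   print(ask("dolphin-phi", "what is the short answer for 2 + 2"))
```

From there any web framework (or even `python -m http.server` plus a form) can wrap `ask()` into a browser page.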

Statistics: Posted by Gavinmc42 — Wed Mar 13, 2024 12:49 am


