FreakyFwoof

@FreakyFwoof@universeodon.com

Musician, composer, husband, father, keyboard player, youtuber, teacher.
Lover of food and technology, Mac/Windows/iOS/Android.
#MusicIsLife #LoveMyWife


MutedTrampet, to random

MutedTrampet.exe was upgraded to version 2.4. No idea where the changelog went, though, probably got lost behind the shelves of disks in the MIDI room. Oh well.

FreakyFwoof,

@MutedTrampet Hope the UI didn't get an entirely new overhaul and break the universe in the process.

FreakyFwoof, to random

Anyone know how to get a hold of the creator behind the NVDA AI content describer? Would like to make it use Ollama as an option. Save some API calls.

FreakyFwoof,

@Bri That works. I'll try email first; I don't get on with GitHub, I'm not a rocket scientist. I do music, not programming, and sometimes even find downloading the most basic things from GitHub a ridiculous chore. Old-fashioned email is at least a possibility lol

FreakyFwoof,

@Bri Don't know if you've tried using Ollama or anything local yourself but it's very interesting and fully-fledged. Obviously not perfect, nothing is, but it's local, it's yours, you can ask it as much or as little as you like and it only costs you a few fan revs if even that. lol

FreakyFwoof,

@Bri OO nice. I'm just using my main Mac for it, and it's so much fun. Doesn't even slow Logic down when I'm doing heavy tasks.

FreakyFwoof,

@Bri Well here's something I can finally help you with, instead of you helping me. Chi told me how to do this:

Allow Ollama to be visible to other machines on the network:
launchctl setenv OLLAMA_HOST "0.0.0.0"

FreakyFwoof,

@Bri Run that in terminal, quit and reopen Ollama, boom, done. Now if I could get this to run at startup prior to Ollama, I'd be happy, or if there's some conf file I can modify so I don't have to do it every damn time, that would be great. Alas...
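For the startup question above: launchctl setenv doesn't survive a reboot, so one common workaround (a sketch only; the label and filename here are made up for illustration) is a LaunchAgent that re-runs the setenv at login, saved as ~/Library/LaunchAgents/com.user.ollama-host.plist:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <!-- Hypothetical label; any unique reverse-DNS name works -->
  <key>Label</key>
  <string>com.user.ollama-host</string>
  <!-- Re-run the same setenv command at every login -->
  <key>ProgramArguments</key>
  <array>
    <string>/bin/launchctl</string>
    <string>setenv</string>
    <string>OLLAMA_HOST</string>
    <string>0.0.0.0</string>
  </array>
  <key>RunAtLoad</key>
  <true/>
</dict>
</plist>
```

Load it once with `launchctl load ~/Library/LaunchAgents/com.user.ollama-host.plist` (or log out and back in), then open Ollama as usual.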

FreakyFwoof,

@Bri Oh really? Strange times. My machine is on ethernet, not wireless, and I wonder if that has anything to do with it? Doesn't happen here.

FreakyFwoof,

@Bri Are you all up to date where that's concerned? I'm sure you are, but worth checking.

FreakyFwoof,

@Bri Anything in the logs under the .ollama folder? Might be something to see there.

FreakyFwoof,

@Bri Not telling you anything you don't know already, you're more server'y than me but just trying to help lol

FreakyFwoof,

@pixelate @Bri I'm using VOLlama with NVDA and I just type something, wait for the clunk of received message, shift tab and read it. I never have issues like that.

FreakyFwoof,

@pixelate @Bri Well try Chi's instead. May work better.
https://github.com/chigkim/VOLlama/releases/

FreakyFwoof,

@pixelate @Bri Alpha 3 is what I'm using.

FreakyFwoof,

@pixelate @Bri I've got Llava:34B and Dolphin Mixtral installed, and it's on my M1 Max. I'm connecting to it over the network from Windows using VOLlama, on my son's machine and my machine, and locally on the Mac too; hasn't broken yet. I left it running overnight just with the window active, came back this morning, all here.

FreakyFwoof,

@pixelate @Bri 64 GB.

FreakyFwoof,

@JamminJerry Oh I know, but I'd like to use local resources anyway, because it's here, it's installed and I just want to see how well it does.

FreakyFwoof,

@Bri I can get the AI Content Describer to work by changing the URL where it sends requests, but I just have to install a model that supports the OpenAI format, otherwise I get this:
error: HTTP Error 400: Bad Request. json: cannot unmarshal array into Go struct field Message.messages.content of type string
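That unmarshal error usually means the two sides disagree on the shape of `content`: OpenAI's vision-style clients send it as an array of parts, while Ollama's native /api/chat endpoint expects a plain string (with images passed separately). A minimal sketch of the two payload shapes, with placeholder model and image values:

```python
import json

# OpenAI "vision" shape: content is an ARRAY of typed parts.
# (Placeholder model name and truncated data URL, for illustration only.)
openai_style = {
    "model": "llava",
    "messages": [{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            {"type": "image_url", "image_url": {"url": "data:image/png;base64,..."}},
        ],
    }],
}

# Ollama's native /api/chat shape: content is a plain STRING,
# with base64 images in a separate "images" list -- a server that
# expects this shape can't unmarshal the array form above.
ollama_style = {
    "model": "llava",
    "messages": [{
        "role": "user",
        "content": "Describe this image.",
        "images": ["<base64 image data>"],
    }],
}

print(type(openai_style["messages"][0]["content"]).__name__)  # list
print(type(ollama_style["messages"][0]["content"]).__name__)  # str
print(json.dumps(ollama_style["messages"][0], indent=2))
```

So pointing the add-on at an endpoint that speaks the OpenAI format (rather than the native one) is one way past the 400.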

FreakyFwoof,

@Bri That's progress though. I know a little more now than I did 5 minutes ago. I just went through every .py file looking for HTTPS requests to the OpenAI API and found that, so yeah. This non-programmer is trying. lol

FreakyFwoof,

@chikim It's called AI Content Describer. I tried modifying the Python script and got errors, because I'm stupid and don't know what I'm doing.
error: HTTP Error 400: Bad Request. json: cannot unmarshal array into Go struct field Message.messages.content of type string

FreakyFwoof,

@GamingWithEars @pixelate @Bri Not really. 32's enough for most people. It's just if you're doing very specific tasks that you'll want more.

evilcookies98, to random
@evilcookies98@dragonscave.space

I must confess that I cannot see a use for Apple Vision Pro for any sane human, unless they want to try to convince people of their total insanity by waving their hands around at objects no one else can see. It just seems like an overpriced paperweight to me.

FreakyFwoof,

@bryansmart @evilcookies98 I said the same thing about bluetooth, and even smartphones. They weren't normal, then they were. This conversation will seem so ridiculous in only 5 years from now, mark my words.

FreakyFwoof, to random

A long long time ago, @arfy made a lua script for Dolphin screen-readers that allowed you to type in plus or minus a number of days and get the date. I just asked Dolphin Mixtral to do the same as an AppleScript using #Ollama running locally and it actually did it. It runs and works just as I wanted. Madness.

-- Ask for an offset in days (negative values step backwards)
set numDays to (text returned of (display dialog "Enter the number of days:" default answer "")) as integer
set targetDate to current date
-- "days" is a built-in AppleScript constant (86400 seconds)
set newDate to targetDate + numDays * days
display dialog "The future date will be: " & (newDate as string)

FreakyFwoof,

@tristan @sclower @arfy Running on my M1 Max, connecting over the network. My Windows machine is 2012 vintage, nowhere near fast enough to do this meaningfully, so I just don't bother.

FreakyFwoof,

@tristan @sclower @arfy No idea. Can't seem to find much in activity monitor. Never goes into swap though, but I have 64GB on this box.

FreakyFwoof, to random

What?...

You: How many years until 2024?
Llama2-uncensored: There are currently 16 years remaining until 2024, as it will begin on January 1st, 2024 and end on December 31st, 2024.
You: What year is it right now?
Llama2-uncensored: It is currently the year 2023.
You: If it's currently 2023, then it can't be 16 years until 2024.
Llama2-uncensored: You are correct in that 2024 is not 16 years away from the current year of 2023. However, I was referring to the number of years remaining until January 1st, 2024 when 2024 begins.

FreakyFwoof,

@sapphireangel Well exactly. haha
