Oh yes, this. Not using AI is best, but if you must: keep putting it on some random corpo's tab and bankrupt them if they're going to insist on having AI 'help bots'.
These “AI” grifters are burning down the planet, computing the average of the entire internet, so that people can order burgers with a chatbot.
I haven't kept up with the prompt injection memes, but I wonder if anyone bothered making a frontend or Ollama plugin that just uses the thousands of random public chatbots to get free high-token-rate usage lol.
Can we make them chat with each other?
Let’s get McDonald’s support to make it
Writing a linked list in Python just feels wrong. I know it’s probably homework.
If you must, the stdlib has collections.deque, which is going to be faster and simpler than rolling your own. Plus no reference semantics to deal with.
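Since the thread is about Python: a minimal sketch of `collections.deque` used as a queue (assuming a queue is what the homework actually wants):

```python
from collections import deque

# deque is a doubly-linked list of fixed-size blocks under the hood,
# so appends and pops at either end are O(1).
q = deque()
q.append("first")        # enqueue on the right
q.append("second")
q.appendleft("urgent")   # O(1) on the left too, unlike list.insert(0, ...)
print(q.popleft())       # dequeue from the left -> "urgent"
print(q.popleft())       # -> "first"
```

For stack-like use, `append()`/`pop()` on the same end works just as well.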
It’s a meme job interview question iirc.
Yeah, linked lists are rarely a good idea. Modern memory optimization, where contiguous regions of memory are loaded into CPU caches, means that array-backed lists have better performance in virtually all situations.
In a way, I’d want to argue that you should actually only ever roll your own linked lists, because you should only use linked lists when you’re not working in-memory, i.e. when array-backed lists are not an option to begin with.
I'm not sure how malloc() works, but I would guess it would attempt to squeeze new allocations into partially-filled memory pages, right? Wouldn't that largely offset the inefficiency?

Transfers to cache happen in much smaller units than a memory page, so I can't imagine an automated treatment like that from the heap end would be effective. However, there are ways to lay out linked lists in memory so that they're more cache-friendly.
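One such cache-friendlier layout (my own sketch, not necessarily what the commenter had in mind): keep all nodes in one contiguous array and link them by index instead of by object reference, so traversal stays inside a region the prefetcher can work with. In C this would mean allocating nodes from one arena instead of scattered malloc() calls; the Python version just illustrates the idea:

```python
# Hypothetical index-linked list: nodes live in one contiguous list,
# and "next" is an index into that list rather than a pointer.
nodes = []          # each entry is [value, next_index]; -1 marks the end

def push_front(head, value):
    nodes.append([value, head])
    return len(nodes) - 1       # index of the new head

def traverse(head):
    out = []
    while head != -1:
        value, head = nodes[head]
        out.append(value)
    return out

head = -1
for v in (3, 2, 1):
    head = push_front(head, v)
print(traverse(head))           # -> [1, 2, 3]
```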
You really need frequent middle insertion (insert joke here) for the linked list to become better than an array list.
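For anyone following along, the asymmetry looks like this (a minimal sketch with a hypothetical `Node` class): splicing after a node you already hold is two assignments, while the array has to shift its whole tail:

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

# Build a -> c, keeping a handle on the node we'll insert after.
a = Node("a")
a.next = Node("c")

# O(1) splice: no shifting, just relink.
a.next = Node("b", next=a.next)

# The equivalent array operation shifts every later element:
arr = ["a", "c"]
arr.insert(1, "b")   # O(n) in the length of the tail
```

The catch, of course, is that *finding* the middle node is O(n) unless something else already hands you the reference.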
Oh damn, I was just about to reply to your reply to @fishface@piefed.social (which is literally directly above this comment, on my screen) suggesting exactly this. Glad that Piefed initially failed to register my clicking “reply”.
What would you use if you don't know how much space you're going to need in advance, and you're only going to read the data once each time the structure is created?
Array list/vector types usually have dynamic resizing built in, and benchmarking always helps if you can do it.
Yes, but dynamic resize typically means copying all of the old data to the new destination, whereas a linked list does not need to do this. The time complexity of reading a large quantity of data into a linked list is O(N), but reading it into an array can end up being O(N^2) or at best O(N log N).
You can make the things in your list big chunks so that you don’t pay much penalty on cache performance.
I thought of another good example situation: a text buffer for an editor. If you use an array, then on large documents inserting a character at the beginning of the document requires you to rewrite the rest of the array, every single character, to move everything up. If you use a linked list of chunks, you can cap the amount of rewriting you need to do at the size of a single chunk.
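A toy version of that chunked buffer (my own sketch; real editors use ropes or piece tables for this): inserting near the front only rewrites the one chunk it lands in, splitting it if it overflows:

```python
CHUNK = 4  # tiny chunk size so the split is easy to see

class Chunk:
    def __init__(self, text, next=None):
        self.text = text
        self.next = next

def insert(chunk, offset, s):
    """Insert s at offset within this chunk, splitting if it grows too big."""
    chunk.text = chunk.text[:offset] + s + chunk.text[offset:]
    if len(chunk.text) > CHUNK:
        chunk.next = Chunk(chunk.text[CHUNK:], chunk.next)
        chunk.text = chunk.text[:CHUNK]

def read(chunk):
    out = []
    while chunk:
        out.append(chunk.text)
        chunk = chunk.next
    return "".join(out)

doc = Chunk("ello", Chunk(" world"))
insert(doc, 0, "h")          # only the first chunk is touched
print(read(doc))             # -> "hello world"
```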
Expanding a dynamic array to powers of 2 has amortized constant complexity so filling one up from empty is O(n).
Well I just had to work it out again myself and you’re right. I dunno what scenario I was thinking of that had worse complexity and whether it was really due to dynamic arrays; I just remember getting asked about it in some interview and somehow the answer ended up being “use a linked list and the time complexity goes down to linear” /shrug
Thanks for the correction!
I’m surprised they went for such a high performance chat bot as Claude
Paying for a needlessly expensive LLM makes leadership think they're "investing in innovation"
Is this even a real screenshot? Seems fake to me. The bot on the support page I tried is much dumber and doesn't seem LLM-based at all. More like the chat bots we had before LLMs fucked up the world.
It’s a good question and I feel like this is an older screenshot? But it could be faked super easily too
They fired their IT manager and hired AI agent.
How can you be sure that they do?
Could be using a cheaper model like haiku, which is still “claude”
Personally, I’ve been using Ecosia as a coding assistant for shit my local LLM on my SteamDeck can’t figure out without triggering thermal threshold warnings.
It's… theoretically more eco-friendly than, uh, Ronald McDonald Teaches Python.
Wait, Ecosia has an AI assistant? One - that sounds ridiculously counterproductive to their goal of being eco, Two - I need to check that out haha
It has an AI assisted search feature.
Which just also happens to have basically no guidelines preventing it from analyzing and rewriting code that you copy paste to it.
lol.
But uh, they have some kind of scheme where AI usage or search queries = a pledge to plant trees.
… ???
At least with most models I’ve run locally, they’ll all at least try to read code, tell you how it works, suggest improvements, etc.
You could probably just copy paste a block of random code into any AI 'thing' and ask it to help you with it, and there's a pretty good chance it would try to.
Bro Grimace is just 1337
Isn’t there a free claude option?
Yeah, but McDonald’s isn’t paying for those tokens.
Add whitespace to taste
ChatGPT is also free.
At least until their subscription model implodes
Would like to see the original image