127-0-0-1_1
127-0-0-1_1 t1_jdvu1b9 wrote
Reply to comment by LordOfTheDerp in First Citizens Bank to purchase assets of Silicon Valley Bank by Substantial-Pass-992
How do you think they’re going to get a higher interest rate for you? This is practically free money. Not only do they get $16.5B in arbitrage if they can maintain depositor liquidity, but the latter is extremely easy now that the Fed is offering collateralized loans against federal bond assets at cost.
127-0-0-1_1 t1_jcqd8se wrote
Reply to comment by KerfuffleV2 in [D] PyTorch 2.0 Native Flash Attention 32k Context Window by super_deap
It's not unlimited memory in a single run, which remains unchanged, but that doesn't seem super relevant to what people want (nothing wrong with multiple runs!). Think about a Turing machine, or heck, yourself. A Turing machine only has access to a single cell of memory at a time, and in practice, modern CPUs only have access to their registers directly. For long-term storage, that goes into RAM, which is accessed on demand.
Similarly, your own memory is not large enough to contain all the information you'd need to complete most complex tasks. That's why you have to write things down and actively try to remember things.
While that uses OpenAI's embedding networks, like the autoregressive LLM itself, it's not like OpenAI has a monopoly on text embeddings by any means (far from it - embeddings have a very straightforward business use and are used on practically every major site you know for things like similarity queries).
While I think OP is overhyping the degree to which this is "infinite memory", in a hypothetical Turing machine formulation where the network can more proactively store and restore memory, it would allow the system to be, at least, Turing complete.
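The store-on-demand / recall-on-demand idea above can be sketched in a few lines. This is a hypothetical toy, not anyone's actual implementation: `embed()` here is a stand-in for a real embedding model (OpenAI's or any other), replaced with a character-frequency vector purely so the sketch runs on its own, and `MemoryStore` is an assumed name for illustration.

```python
import math

def embed(text):
    # Toy embedding: 26-dim character-frequency vector. A real system
    # would call a learned embedding model here instead.
    vec = [0.0] * 26
    for ch in text.lower():
        if 'a' <= ch <= 'z':
            vec[ord(ch) - ord('a')] += 1.0
    return vec

def cosine(a, b):
    # Standard cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """External memory: the model's 'RAM', written and read on demand."""
    def __init__(self):
        self.items = []  # list of (embedding, text) pairs

    def store(self, text):
        self.items.append((embed(text), text))

    def recall(self, query, k=1):
        # Similarity query: return the k stored snippets closest
        # to the query in embedding space.
        q = embed(query)
        scored = sorted(self.items,
                        key=lambda item: cosine(item[0], q),
                        reverse=True)
        return [text for _, text in scored[:k]]

mem = MemoryStore()
mem.store("flash attention enables a 32k context window")
mem.store("the fed offers collateralized loans on bond assets")
print(mem.recall("attention context window"))
# → ['flash attention enables a 32k context window']
```

The Turing-machine framing is the loop around this: at each step the network decides whether to `store()` something from its context or `recall()` something back into it, so the bounded context window acts like a register file in front of unbounded external tape.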
127-0-0-1_1 t1_jc4umjb wrote
Reply to comment by big_ol_tender in [D] ChatGPT without text limits. by spiritus_dei
How are they going to take your ball away? By having a nicer ball?
Of course you, alone, are going to produce worse products than a bunch of postdoctorates with the budget of a small nation state.
127-0-0-1_1 t1_j6ocy0u wrote
Reply to Salem tips for February day trip? by elevenTsix
Was just there. A lot of the shops are closed because “seasonal”, but there’s no lack of witch/occult themed shops open still. There’s a section of town with old brick road that has a lot of the shops which fits that vibe. Peabody was great, although somewhat discontinuous with the rest of the town. Lots of interesting Asian exhibits. It’s a huge, modern museum. Nothing witchy, though.
Honestly, the witch trial sites themselves are not particularly interesting. It’s mostly old buildings, some of which have modern residents; the memorial, which isn’t very notable; and a cemetery with a few somewhat notable people.
127-0-0-1_1 t1_jebcqhc wrote
Reply to comment by Hashtagworried in Here’s What Happened When ChatGPT Wrote to Elected Politicians - Cornell researchers used artificial intelligence to write advocacy emails to state legislators. The responses don’t bode well for democracy in the age of A.I. by speckz
They were being facetious, not seriously saying that they were using AI.