@dimasandrianp_: ⭐⭐⭐ #foryoupage #rowojombor #klaten24jam

𝕳𝖆𝖑𝖑𝖔𝕲𝖊𝖓✨ ("Hello, gang ✨")
Region: ID
Posted: Sunday 09 March 2025 02:21:10 GMT
Stats: 37006 · 1988 · 36 · 149

Comments

saputroaldi3
@anak e mak sri01 :
Rowo ⭐⭐⭐⭐⭐
2025-03-09 17:25:44
0
ahmad.sayang_
Gabutttygy :
Good thing I didn't go out 🗿
2025-03-09 08:15:04
5
supermanja83
🌀 :
the one from this morning
2025-03-09 09:29:22
5
bukan_pahlawan177
hann~ :
good thing I didn't end up going there earlier 🤣
2025-03-09 14:26:23
14
msprtma
ⓅⓇⓉⓂ :
Panicking? Of course I'm panicking 😜
2025-03-09 10:14:54
2
vanszwhy
𝙈𝙍𝙆𝙕.𝘾𝙊 :
Uh-oh, caught you 🤭😅
2025-03-09 02:46:07
2
tirrdesain453
Wesmbuh. :
Good thing I didn't get caught, bro 🗿
2025-03-09 16:40:36
1
dann_epep
𝕯𝖆𝖓𝖉𝖎𝖎 :
Good thing I'd already gone home, bro 🗿
2025-03-09 13:59:20
1
13erteman
JO :
Better to go fishing 😂
2025-03-09 12:07:09
1
scaacqb
LkzCuexx🆒 :
Can't catch me, can't catch me 😹
2025-03-09 11:31:36
1
gansdos
arayyyn :
Watch out for the local kids (akamsi), bro 😭🙏
2025-03-09 09:45:44
1
gndtblmsmbuh
𝙂𝙉𝘿𝙏 ⚡ :
Run, it's the police 😭
2025-03-09 09:28:23
1
diksedangmandi
dik2nd⚡ :
Rowo is crowded now
2025-03-10 06:00:54
0
gatau88.0
gatau :
What's going on, bro?
2025-03-09 18:47:53
0
dindaevitasari0
S.A.D.E.S :
Luckily I didn't end up going out to Rowo, my key went missing instead 😭🙏
2025-03-09 15:49:05
0
wenzz2910
wenzz :
There's a video up top showing the part at the back
2025-03-09 15:43:38
0
https.chromefaze
TodzzGokk :
Good thing we ended up going to Clongop earlier
2025-03-09 14:43:25
0
dhansuwe1
🚀🐊 :
The police had just given a talk in Trucuk
2025-03-09 14:22:47
0
luwak_official
CAH DESO OFFICIAL :
The raw footage, bro
2025-03-09 14:09:13
0
azka4ever
AZKA :
😁
2025-03-10 04:22:53
0
azka4ever
AZKA :
😂
2025-03-10 04:22:53
0
menolakboncoss
fishing anti boncos :
⭐⭐⭐⭐⭐⭐
2025-03-09 20:31:07
0
langgeng1236
MAS BON 1928 :
😊
2025-03-09 16:16:55
0
mutiaism__
mutiaaaa :
Failed to dodge 😂
2025-03-09 03:56:53
3
kenthung05
iyadera :
Let's just have a meetup
2025-03-09 03:15:03
3

Other Videos

AI’s Memory Problem

Today, I’m diving into one of the most pressing challenges for large language models (LLMs): memory. These models are incredibly intelligent, but their memory is shockingly short. Think of it as having a genius assistant who forgets your last conversation after 10 minutes. The best models today, like O1 Pro, boast a 200,000-token context window, but even that’s just a fraction of what’s needed for meaningful, long-term interactions.

Here’s the kicker: solving this isn’t cheap. I crunched the numbers in my Substack, and providing even basic long-term memory for a platform like ChatGPT—serving 125 million daily users—would cost over half a trillion dollars annually. And that’s not for human-level memory; that’s just a few months of retention.

The real question is: how do these memory limits shape what kinds of problems we solve? What are we inherently limited from tackling because our AI thinking partners can’t remember beyond a few exchanges? Even more fascinating, how does this shape our memory? There’s anecdotal evidence of people adjusting their thinking and vocabulary because of how they interact with LLMs. Are we adapting to their limits, or will these limitations drive the next breakthrough in AI?

I believe memory will dominate AI discussions in 2025, and it’s a topic we need to address now. If this resonates, check out my Substack for a deeper dive, and let’s keep the conversation going.
Hashtags: #product #productmanager #productmanagement #startup #business #openai #llm #ai #microsoft #google #gemini #anthropic #claude #llama #meta #nvidia #career #careeradvice #mentor #mentorship #mentortiktok #mentortok #careertok #job #jobadvice #future #2024 #2025 #story #news #dev #coding #code #engineering #engineer #coder #sales #cs #marketing #agent #work #workflow #smart #thinking #strategy #cool #real #jobtips #hack #hacks #tip #tips #tech #techtok #techtiktok #openaidevday #aiupdates #techtrends #voiceAI #developerlife #cursor #replit #pythagora #bolt #chatgpt #memory #question #future
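The half-trillion-dollar figure above can be sanity-checked with a back-of-envelope sketch. Only the 125 million daily-user count comes from the post; every other number below is an illustrative assumption, not a figure the author gives:

```python
# Back-of-envelope check of the post's annual-cost claim.
# Only daily_users is stated in the post; the rest are hypothetical
# assumptions chosen purely for illustration.

daily_users = 125_000_000              # stated in the post
queries_per_user_per_day = 10          # assumption
history_tokens_per_query = 1_000_000   # assumption: long-term memory fed back as input context
usd_per_million_input_tokens = 1.00    # assumption

daily_cost = (daily_users
              * queries_per_user_per_day
              * history_tokens_per_query / 1_000_000
              * usd_per_million_input_tokens)
annual_cost = daily_cost * 365

print(f"≈ ${annual_cost:,.0f} per year")  # ≈ $456,250,000,000 — roughly half a trillion
```

With these assumed per-query numbers the estimate lands in the same ballpark as the post's "over half a trillion dollars annually"; each assumption scales the result linearly, so the claim hinges entirely on how much history is reprocessed per query and at what token price.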
