21113 (@21113) • Hey
Publications
- 👀👀👀
- Go and grab your #Opensea mint 🥳🥳🎉🎉🚁🚁🚀🚀🛸🛸 LFG don’t miss eligible fam 🥳🥳🎉🎉🚁🚁🚀🚀🛸🛸🐳
#TaprootWizards #BTCNFTS #BTCNFT #WizardShower #AZUKI #NFT #NFTSART #LENS #BTC #ETH #NFTART #Phaver #ZERION #BAYC #MAYC #BAKC #BACC #WG0 #SewerPass
#LFG #WAGMI #AVAX #apefollowape #KoruDao
🥳🥳🎉🎉🚁🚁🚀🚀🛸🛸😇🙏🏻🧙
[Gemesis - Drop | OpenSea](https://opensea.io/collection/gemesis/drop)
Heralding the dawn of NFT power users, OpenSea Pro unlocks a new level of optionality, selection, and control for pro NFT collectors. Previously known as Gem, OpenSea Pro has been months in the making, culminating in the platform’s rebirth as the most powerful NFT marketplace aggregator.
Commemorating our community’s journey, we are releasing Gemesis, a thank you to Gem community members who have steered the ship with us. This limited-edition collection encapsulates our evolution, celebrates our community, and embodies the exciting road ahead.
[OpenSea Pro](https://pro.opensea.io)
- my Lens Circle
- are there any kpop fans on here
- a bit both since humans will become cyborgs
- Don’t miss #LiFi bridge 🥳🥳🎉🎉 Good fund raise 🚁🚁🚀🚀🛸🛸🧙♀️🧙♀️👀👀👀
#TaprootWizards #BTCNFTS #BTCNFT #WizardShower #AZUKI #NFT #NFTSART #LENS #BTC #ETH #NFTART #Phaver #ZERION #BAYC #MAYC #BAKC #BACC #WG0 #SewerPass
#LFG #WAGMI #AVAX #apefollowape #KoruDao
🤩🤩🐳🐳🎉🎉🚁🚁🚀🚀🛸🛸🥳🥳🧙♀️🧙♀️🧙♀️🧙♀️
[LI.FI - Powers any cross-chain strategy on Twitter](https://twitter.com/lifiprotocol/status/1643226950318837768)
“Exciting news! @lifiprotocol is happy to announce that we've raised $17.5M in a Series A round co-led by @coinfund_io and @superscrypt 💪”
- I’m in good company on #Lens 💚🌿✨
Nice work @juancito.lens
Check it out here https://circles.inlens.xyz/
- Lens as a social base layer looks promising
- Sure, you can mint here: apes on zk, https://zkape.io/
- US rate cuts, Hong Kong going compliant, the Bitcoin halving, the Ethereum upgrade, getting rich farming L2 airdrops: from the second half of this year into next year there's plenty of room for imagination. Right now outside capital still hasn't come in; it's all the old hands playing.
- As high-traffic apps keep developing, eth, zks, arb, and matic will capture users quickly, and bsc will drop to fourth or fifth place. It has no good apps, just shitcoins. CZ is going to start getting anxious.
- I'm bullish on eth uni aave bnb cake arb rdnt zks lens
- Binance has launched the Binance DeFi Wallet. Built on MPC technology, it supports seedless operation, wallet recovery, and more, and integrates Binance ID. It currently supports BNB Chain and Ethereum, with more chains to come. With arb listed, Binance is getting nervous. That's a good thing; it speeds up the industry, since only competition drives an industry forward quickly.
- Airdrop farming will drive a wave of exchange withdrawals and outside money entering the market. Large amounts of eth, arb, bnb, op, and so on will be withdrawn on-chain for interactions, exchange balances will shrink, outside capital will flow in, an uptrend will form, and a positive cycle will take hold. A few more big airdrops and farming enthusiasm will be fully ignited; the army of airdrop farmers lining up to enter will directly drive wallet usage, and the basic landscape of web3.0 will take shape. Will Tencent panic then? Back when Tencent's social products gained momentum, they were unstoppable. When the web3.0 landscape takes shape, will the social leader Lens build the same unstoppable momentum?
- GN🥃 all! Sorry for being absent here. My time has mostly been spent doing deep dives in AI and then trying to let my brain breathe from information overload. It’s not easy! Anyway, I wanted to share some general frameworks and info I think are crucial for navigating our future, yet hardly anybody is paying attention to right now in AI:
The Basics:
*************
1) LLMs (large language models like GPT-4) don’t connect to the internet. They can be thought of as similar to CPUs: they do not store data; they are essentially zip files of rules/principles for calculating using logic and reasoning.
2) Computers up until this point have performed -Deterministic- compute.
2 + 2 = 4
A bit is a 1 or 0
LLMs perform -Probabilistic- compute.
2 + 2 is probably equal to 4-ish. Maybe 3, maybe 5.
It’s a fundamentally different kind of computation. Our brains do both types, and now computers can too!
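The contrast above can be sketched in a few lines of Python. This is purely a toy illustration, not how an LLM actually computes: `probabilistic_add` just samples from a hand-made distribution centered on the true sum.

```python
import random

def deterministic_add(a, b):
    # Classical compute: the same inputs always give the same output.
    return a + b

def probabilistic_add(a, b, rng):
    # Toy model of probabilistic compute: the answer is drawn from a
    # distribution centered on the true sum, the way an LLM samples tokens.
    return rng.choice([a + b - 1, a + b, a + b, a + b, a + b + 1])

rng = random.Random(0)
print(deterministic_add(2, 2))                            # always 4
print([probabilistic_add(2, 2, rng) for _ in range(5)])   # mostly 4, sometimes 3 or 5
```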
The Cool Stuff:
****************
3) The cool stuff not enough people are talking about: Rebuilding “computers” with LLMs as the CPU:
If you download an LLM file and run it locally, you might be disappointed to find it has no memory. It won’t remember your name, and follow-up questions will fail. Every question or statement starts the “conversation” over. It would be like throwing math questions at a CPU: it will compute each problem, but with no memory to store the results. It’s rather uninteresting.
What makes ChatGPT so successful is that it re-feeds the entire printed conversation back into the LLM every time. It “fakes” a memory, but there are limits.
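A minimal sketch of that re-feeding trick, assuming a hypothetical `fake_llm` standing in for a real model call (this is illustrative only, not ChatGPT's actual implementation):

```python
def fake_llm(prompt: str) -> str:
    # Stand-in for a real, stateless model call: it only "knows"
    # whatever happens to be inside the prompt it was handed.
    if "My name is Ada" in prompt and "What is my name?" in prompt:
        return "Your name is Ada."
    return "I don't know."

def chat_turn(history: list, user_msg: str) -> str:
    history.append(f"User: {user_msg}")
    # The entire transcript so far is concatenated and re-sent each turn.
    prompt = "\n".join(history)
    reply = fake_llm(prompt)
    history.append(f"Assistant: {reply}")
    return reply

history = []
chat_turn(history, "My name is Ada.")
print(chat_turn(history, "What is my name?"))  # works only because turn 1 was re-sent
```

Strip the first turn from the transcript and the "memory" vanishes, which is exactly why context-window limits matter.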
So how do we build LLMs with a memory? Well, quite a few teams are working on frameworks to chain things together, similar to how PC manufacturers assembled motherboards, graphics cards, hard drives, RAM, etc., and Apple/Microsoft/IBM developed operating systems.
Two teams I’ve really been intrigued by are LangChain and Pinecone. LangChain is a framework, or a way of putting components together…think of it as the full PC you might buy in a store. Pinecone is a vector database, which you can think of as a hard drive in this “PC.”
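To make the “hard drive” analogy concrete, here is a toy vector store in plain Python. The 3-d vectors are hand-made stand-ins for model-generated embeddings, and lookup is exact cosine similarity, whereas a real system like Pinecone does this at scale with approximate nearest-neighbor search:

```python
import math

def cosine(a, b):
    # Cosine similarity: how closely two vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# The "database": (text, embedding) pairs.
store = [
    ("the cat sat on the mat", [0.9, 0.1, 0.0]),
    ("stock prices fell today", [0.0, 0.2, 0.9]),
]

def nearest(query_vec):
    # Return the stored text whose embedding is most similar to the query.
    return max(store, key=lambda item: cosine(query_vec, item[1]))[0]

print(nearest([0.8, 0.2, 0.1]))  # → "the cat sat on the mat"
```

An LLM app retrieves the nearest stored snippets like this and pastes them into the prompt, giving the stateless model a long-term memory.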
Components in this stack:
* Chains: The core of LangChain. Components (and even other chains) can be strung together to create chains.
* Prompt templates: Templates for different types of prompts, like “chatbot”-style templates, ELI5 question answering, etc.
* LLMs: Large language models like GPT-3, GPT-4, LLaMA, LaMDA, BLOOM, etc
* Indexing Utils: Ways to interact with specific data (embeddings, vectorstores, document loaders)
* Tools: Ways to interact with the outside world (search, calculators, etc)
* Agents: Agents use LLMs to decide what actions should be taken. Tools like web search or calculators can be used, and all are packaged into a logical loop of operations.
* Memory: Short-term memory, long-term memory.
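The chain idea above can be sketched without LangChain's actual API: tiny text-in/text-out components composed into a pipeline. All function names here are hypothetical stand-ins, not real LangChain calls.

```python
def prompt_template(question: str) -> str:
    # A "prompt template" component: wraps the input in an instruction.
    return f"Answer briefly: {question}"

def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call.
    return f"[model reply to: {prompt}]"

def chain(*steps):
    # Compose components left to right into one callable:
    # the output of each step becomes the input of the next.
    def run(x):
        for step in steps:
            x = step(x)
        return x
    return run

qa_chain = chain(prompt_template, fake_llm)
print(qa_chain("What is a vector database?"))
```

Swap in a retrieval step before the template and you get the memory-augmented pattern described above.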
I’m going to leave it at this for now. If this has piqued your interest, I’d recommend checking out this somewhat approachable introduction to LangChain and Pinecone:
Building The Future with LLMs, LangChain, & Pinecone
https://youtu.be/nMniwlGyX-c
- Tomorrow is **Piano Day** and we're celebrating web3 style!
◼ March 29
◼ 8 composers on 1 Playlist Mint, including your Lens frens @ghosttrain.lens @adamprotz.lens @bobbyyaps.lens & @piano-kamilla.lens
◼ Open Edition for 1 week on @decentxyz
◼ 0.0329 eth
🎧 A little preview of my piece, Ephemera
- I'm saying my piece in the #futureofsocial by signing the @t2world.lens manifesto. Which values do you support?
https://manifesto-app.t2.world/share/clfycsac19762221fmcc6r0nrkj
- interesting concept, would you mind explaining how the score is calculated?
- Just minted this adorable Lens Llamas NFT! 🎉🦙
These charming llamas are sure to make a delightful addition to any digital collection.
Thank you for this @cardenas
- 🤍
(JIC you recognise the collection, this is no endorsement of the collection nor financial advice (as always) - I literally know nothing about the creator or the collection)
- Verifying your job experience on-chain thanks to Request Network
- wavWRLD's Weekly Wavs 🌊🌐🎶 Issue #28
Ahhhhhh spring. Things are mellow right now. There’s a ton of great music dropping: of course on @soundxyz_.lens and @catalog.lens, but increasingly on @beatsapp.lens and @decentxyz.lens as well. Each platform offers something unique for creators, and the stronger the cross-platform ecosystem we build, the better equipped artists will be to pursue their creative journeys. That’s a win-win.
We're so excited to bring our wavLETTERs to @lensprotocol 🌿 Comment your previous & upcoming Lens music drops below to be featured in the next one!
COLLECT FOR FREE - Must follow to collect
Read the full issue & subscribe here: https://open.substack.com/pub/wavwrld/p/wavwrlds-weekly-wavs-issue-28?r=1ml126&utm_campaign=post&utm_medium=web
#wavWRLD #wavLETTER #newsletter #musicnfts @creators @mixtape
- Weekend goals https://www.youtube.com/watch?v=uGLuiksGubg&t=30s
Happy Sunday everyone!
- Cinque Torri
testing lenshareapp.xyz by @nikoemme
- Hey everyone, I’m super excited to announce that I joined Beats. @beatsxyz.lens
🌿🌿🌿
Beats is bringing music NFTs to Lens protocol and I'll be bringing some weirdness to your ears!
🌿🌿🌿
I'm releasing an experiment in short form music, minimalism, and synthesis called *Helicon Descending*
🌿🌿🌿
This album is 9 short songs released one song a week starting Monday, April 3rd.
There will be more details about the dates and times of the drops!
🌿Stay Tuned frens 👻🌿
Check out my Profile and get ready for some awesome music and exclusive content for my collectors.
https://www.beatsapp.xyz/profile/zombieshepherd.lens
- their algo is not that open, it's more of showing how they connect different black boxes together lol but I am fully for real open-source algos
- Should lens follow the same algo?
- ELON MUSK: the Twitter algorithm will be released and open-sourced today at noon Pacific Time.
- Paying for my drinks at our local brewery with Bitcoin. Local adoption of crypto is the way ❤️🔥
- EthTokyo roll call 🌸
- This has not passed any kind of governance.
- I am going to grow a beard until the next bull market
- or mint here
- Good morning 💙🌾💚🌿💛🌾💜🌿🤎
- Talked to two DeFi founders last week who said they are both no longer hiring in the US because of the recent uncertainty. Very unfortunate, as the US talent market has been strong in web3 for ages.
- lens bright as the sunshine
- What version are you on?