Explore
Connect with communities and discover new ideas.
Communities
Sui is a Layer 1 blockchain designed as the first internet-scale programmable blockchain platform.
Move is an executable bytecode language used to implement custom transactions and smart contracts.
Web3 (also known as Web 3.0) is an idea for a new iteration of the World Wide Web which incorporates concepts such as decentralization, blockchain technologies, and token-based economics.
The Graph is a decentralized protocol for indexing and querying blockchain data. The Graph makes it possible to query data that is difficult to query directly.
Aave is a decentralized non-custodial liquidity protocol where users can participate as depositors or borrowers.
Peera is a decentralized questions-and-answers protocol for Web3 where users can organize and store their interests and skills, creating a common community platform.
Cyfrin Updraft is an education platform specializing in teaching the next generation of smart contract developers.
The InterPlanetary File System (IPFS) is a protocol, hypermedia and file sharing peer-to-peer network for storing and sharing data in a distributed file system.
Polygon is a decentralised Ethereum scaling platform that enables developers to build scalable, user-friendly dApps with low transaction fees without sacrificing security.
Ankr makes accessing Web3 easy for those who want to build and earn on the future web. Ankr is the main infrastructure provider for Polygon, BNB Smart Chain, and Fantom.
Walrus is a decentralized storage and data availability protocol designed specifically for large binary files, or "blobs".
Koii is a new way to design communications infrastructure that distributes computing authority across a wider group of personal devices.
Functionland is replacing the cloud storage and service subscription economy by introducing a new category of products called Blockchain-Attached Storage. It creates value by auto-minting crypto for users and allocating a share to developers.
Solidity is an object-oriented, high-level language for implementing smart contracts. It is a curly-bracket language designed to target the Ethereum Virtual Machine (EVM).
Fractal Visions is a builder-owned and -operated creative Web3 NFT project hub and a multifaceted, multidimensional experience, bridging the gap between the physical and digital worlds.
Vyper is a relatively new, Pythonic programming language used to write smart contracts. Vyper targets the Ethereum Virtual Machine (EVM), making it virtually impossible for developers to write misleading programs.
Bounty
- +15 · Xavier.eth (301) · For Sui · Jun 17, 2025
How do ability constraints interact with dynamic fields in heterogeneous collections?
I'm building a marketplace that needs to handle multiple asset types with different ability requirements, and I've hit some fundamental questions about Move's type system.

I want to store different asset types in the same collection, but they have different abilities:

- Regular NFTs: key + store (transferable)
- Soulbound tokens: key only (non-transferable)
- Custom assets with transfer restrictions

```move
public struct Marketplace has key {
    id: UID,
    listings: Bag, // Want to store different asset types here
}

// This works for transferable assets
public fun list_transferable<T: key + store>(
    marketplace: &mut Marketplace,
    asset: T,
    price: u64
) { /* ... */ }

// But how to handle soulbound assets?
public fun list_soulbound<T: key>( // No store ability
    marketplace: &mut Marketplace,
    asset_ref: &T, // Can only take reference
    price: u64
) { /* How do I store metadata about this? */ }
```

Key questions:

1. Ability requirements: When using dynamic_field::add(), does V always need store at compile time? Can wrapper types work around this?
2. Heterogeneous storage: Can a single Bag store objects with different ability sets (key + store + copy vs. key + store), and handle them differently at runtime?
3. Type safety: Since dynamic fields perform type erasure, how do I maintain type safety when retrieving values? What's the pattern for storing type metadata?
4. Witness pattern: How do ability constraints work with phantom types? Can I store Asset<A> and Asset<B> in the same collection and extract type info later?

I'm building a system where NFTs, soulbound tokens, and restricted assets all need marketplace functionality but with different transfer semantics. I've tried wrapper types, multiple collections per ability set, and separate type metadata storage. Each has tradeoffs between type safety, gas costs, and complexity.
- +10 · For Sui · May 29, 2025
Why does BCS require exact field order for deserialization when Move structs have named fields?
I've been diving deep into BCS encoding/decoding in Move, particularly for cross-chain communication and off-chain data processing. While working through the examples in the Sui Move documentation, I encountered some behavior that seems counterintuitive, and I'm trying to understand the underlying design decisions.

According to the BCS specification, "there are no structs in BCS (since there are no types); the struct simply defines the order in which fields are serialized." This means that when deserializing, we must call the peel_* functions in exactly the same order as the struct's field definition.

My specific questions:

1. Design rationale: Why does BCS require exact field order matching when Move structs have named fields? Wouldn't it be more robust to serialize field names alongside values, similar to JSON or other self-describing formats?

2. Generic type interaction: The docs mention that "types containing generic type fields can be parsed up to the first generic type field." Consider this structure:

```move
struct ComplexObject<T, U> has drop, copy {
    id: ID,
    owner: address,
    metadata: Metadata,
    generic_data: T,
    more_metadata: String,
    another_generic: U,
}
```

How exactly does partial deserialization work here? Can I deserialize up to more_metadata and ignore both generic fields, or does the first generic field (generic_data) completely block further deserialization?

3. Cross-language consistency: When using the @mysten/bcs JavaScript library to serialize data that will be consumed by Move contracts, what happens if:
- I accidentally reorder fields in the JavaScript object?
- The Move struct definition changes field order in a contract upgrade?
- I have nested structs with their own generic parameters?

4. Practical implications: In production systems, how do teams handle BCS schema evolution? Do you version your BCS schemas, or is the expectation that struct field order is immutable once deployed?
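The core of the question, that positional formats carry no field names, can be illustrated with a plain Python sketch. This is not real BCS, just fixed-order little-endian packing with the stdlib struct module, but it shows how peeling fields in the wrong order silently misinterprets the bytes:

```python
import struct

# Toy positional encoding: a struct {a: u32, b: u16} is just a's bytes
# followed by b's bytes -- no field names, no tags. BCS works the same way.
def serialize(a: int, b: int) -> bytes:
    return struct.pack("<I", a) + struct.pack("<H", b)

def deserialize_correct(data: bytes):
    a = struct.unpack_from("<I", data, 0)[0]   # peel the u32 first
    b = struct.unpack_from("<H", data, 4)[0]   # then peel the u16
    return a, b

def deserialize_wrong_order(data: bytes):
    b = struct.unpack_from("<H", data, 0)[0]   # peeling u16 first reads a's low bytes
    a = struct.unpack_from("<I", data, 2)[0]   # the rest is misaligned
    return a, b

data = serialize(7, 300)
print(deserialize_correct(data))      # (7, 300)
print(deserialize_wrong_order(data))  # (19660800, 7) -- no error, just wrong values
```

Note that the wrong-order read does not fail; it returns plausible-looking garbage, which is exactly why BCS decoders must mirror the struct's field order.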
- +10 · For Move · Mar 11, 2025
Sui Move vs Aptos Move - What is the difference?
Sui Move and Aptos Move are two prominent implementations of the Move programming language. While both are rooted in the same foundational principles, they have diverged significantly in design, execution, and ecosystem development. To better understand their differences, we need to uncover some key aspects:

How do their runtimes differ? Both Sui and Aptos implement their own custom Move virtual machines (VMs). How does this impact performance, scalability, and developer experience? For instance: Does Sui's runtime optimize for parallel execution differently than Aptos's? Are there notable differences in transaction lifecycle management or gas models?

What are the differences between their standard libraries? The Move standard library is a critical component for building smart contracts. However, Sui and Aptos have forked their implementations, leading to divergence: Are there modules or functions unique to one implementation but absent in the other? How do these differences affect common use cases like token creation, NFTs, or decentralized finance (DeFi)?

How does data storage differ between them? One of the most significant distinctions lies in how Sui and Aptos handle data storage. Sui uses an object-centric model, where each object has its own ownership and permissions, while Aptos retains a more traditional account-based model similar to Ethereum's. How does this impact state management, composability, and gas efficiency?

Is it fair to say that Aptos is closer to the EVM while Sui is closer to the SVM? Some developers argue that Aptos's account-based architecture resembles Ethereum's EVM, while Sui's object-centric approach aligns more closely with Solana's SVM. Do you agree with this analogy? Why or why not? How does this architectural choice influence developer ergonomics and application design?

Are there universal packages that work for both Sui Move and Aptos Move? Given their shared origins, it would be ideal if some libraries or tools were interoperable across both ecosystems. Are there any existing universal packages or frameworks that work seamlessly on both platforms? If not, what are the main barriers to achieving compatibility?

Can one of them be transpiled into the other? If a project is built on Sui Move, could it theoretically be transpiled to run on Aptos Move, or vice versa? What are the technical challenges involved in such a process? Are there tools or compilers currently available to facilitate this kind of migration?
Newest
- For Sui · Jun 19, 2025
Soulbound and Transferable Assets
Great article! I'd like to add a few practical points to reinforce the design and type safety around heterogeneous assets in Sui Move:

✅ Since dynamic_field::add() requires the store ability, assets like soulbound tokens (which only have key) can't be stored directly. Instead, store only their ID and listing metadata, which do have store.

✅ Best approach: separate collections based on ability constraints — one VecMap for key + store assets (e.g., transferable NFTs) and another VecMap of metadata for key-only assets (e.g., soulbound tokens).

✅ Add a runtime asset_type: String tag to the metadata. This lets you identify and safely handle asset logic (e.g., transfers, display) even after type erasure.

✅ Phantom types are excellent for compile-time type tagging and preventing developer misuse (like accidental transfers of non-transferable tokens).

This modular structure is scalable, avoids Move ability violations, and allows for flexible marketplace design without sacrificing safety. Great work on explaining it so thoroughly!
Building with Rust on Sui
I saw this repo recently when checking the MystenLabs GitHub: https://github.com/MystenLabs/move-binding

Move Binding is a Rust library that provides a way to interact with Sui Move packages on-chain. It reads Move packages from the Sui blockchain and generates corresponding Rust structs and function entry points, allowing for seamless integration between Move and Rust.

To use Move Binding in your project, add the following dependency to your Cargo.toml:

```toml
[dependencies]
move-binding-derive = { git = "https://github.com/MystenLabs/move-binding" }
move-types = { git = "https://github.com/MystenLabs/move-binding" }
```
- harry phan (458) · For Sui · Jun 18, 2025
Building a Marketplace with Heterogeneous Assets
When building a marketplace on a blockchain using the Move programming language, one of the most intriguing challenges is managing assets with different ability constraints in a single collection. Whether you're dealing with transferable NFTs, non-transferable soulbound tokens, or custom assets with unique transfer restrictions, Move's strict type system demands careful design to ensure type safety and efficiency. In this post, we'll dive into how ability constraints interact with dynamic fields in heterogeneous collections, explore practical solutions, and share a robust approach to building a marketplace that handles diverse asset types.

Understanding Move's Abilities

Move, designed for blockchains like Sui and Aptos, uses abilities to define what operations a type supports. The two key abilities relevant to our marketplace are:

- key: allows a type to be stored in global storage as an object.
- store: permits a type to be embedded within another object, such as a struct or collection.

Our marketplace needs to handle:

- Regular NFTs: have key + store, making them transferable and storable.
- Soulbound tokens: have only key, meaning they're non-transferable and cannot be stored in other objects.
- Custom assets: have varying abilities, potentially with transfer restrictions.

The goal is to store these assets in a single collection, like a Bag, and manage their listings while respecting Move's type system.

The Challenge: Dynamic Fields and Ability Constraints

Move's dynamic_field::add() function allows adding fields to objects dynamically, which seems ideal for a heterogeneous collection. However, the value type V must have the store ability. This poses a problem for soulbound tokens, which lack store. So, how do we store and manage assets with different ability sets in a marketplace?

Key questions:

1. Does V in dynamic fields always need store? Can we use wrapper types to work around this?
2. Can a single Bag store objects with different abilities (e.g., key + store vs. key)?
3. How do we maintain type safety with dynamic fields' type erasure?
4. How do phantom types and the witness pattern help manage heterogeneous assets?

Let's tackle each question and build a solution.

Ability Requirements for Dynamic Fields

The Move documentation confirms that dynamic_field::add() requires V to have the store ability. This is because dynamic fields are stored within an object, and Move enforces that embedded values must be storable. For regular NFTs with key + store, this is straightforward: we can store them directly in a Bag or dynamic field.

For soulbound tokens with only key, direct storage is impossible. A wrapper type, like struct Wrapper<T> has store { asset: T }, won't work, because T lacks store. Instead, we can store metadata, such as the asset's ID and listing details, which does have store. For example:

```move
struct Metadata has store {
    id: ID,
    price: u64,
    asset_type: String,
}
```

Heterogeneous Storage in a Single Collection

A Bag in Move is designed to store values with the store ability, but all values must conform to the same type constraints. This means a single Bag cannot store both NFTs (key + store) and soulbound token metadata unless they're wrapped in a common type with store. However, mixing types in one collection often leads to complexity and potential type safety issues. A better approach is to use separate collections for different ability sets:

- Transferable assets: store the assets directly in a vector or Bag.
- Soulbound tokens: store their IDs and metadata in a vector.

This separation respects Move's ability constraints while keeping the system modular and maintainable.

Maintaining Type Safety

Dynamic fields erase type information at runtime, so retrieving a value requires specifying the type at compile time, as in dynamic_field::remove<K, V>(). This ensures type safety but complicates handling heterogeneous types. To manage different asset types, store a type tag (e.g., a String like "NFT" or "SoulboundToken") in the metadata. At runtime, check the tag to determine how to process the listing. For example:

```move
public struct ListingMetadata has store {
    asset_id: ID,
    price: u64,
    asset_type: String,
}
```

When retrieving, use the asset_type to decide whether to treat the asset as an NFT or a soulbound token, ensuring correct handling while maintaining compile-time type safety for the stored metadata.

Phantom Types and the Witness Pattern

Phantom types in Move, like struct Asset<phantom T>, are useful for tagging different asset types without runtime overhead. For instance, you might define Asset<Transferable> and Asset<Soulbound> to distinguish variants. However, the struct itself must still have store to be stored in a collection, and type information is erased at runtime. To extract type info later, store a metadata field like asset_type alongside the asset or its ID. This allows you to differentiate Asset<Transferable> from Asset<Soulbound> during processing, such as when executing transfers or displaying listings.

A Practical Marketplace Design

Here's a practical sketch of a marketplace that handles both transferable and soulbound assets.

Marketplace structure:

```move
use sui::object::{Self, UID, ID};
use sui::vec_map::{Self, VecMap};
use std::string::{Self, String};

struct Marketplace has key {
    id: UID,
    transferable_listings: VecMap<ID, ListingWithAsset>,
    soulbound_listings: VecMap<ID, ListingMetadata>,
}

struct ListingWithAsset<T: key + store> has store {
    asset: T, // T must have key + store
    price: u64,
}

struct ListingMetadata has store {
    asset_id: ID,
    price: u64,
    asset_type: String,
}
```

Transferable assets:

```move
public fun list_transferable<T: key + store>(
    marketplace: &mut Marketplace,
    asset: T,
    price: u64
) {
    let id = object::id(&asset);
    let listing = ListingWithAsset { asset, price };
    vec_map::insert(&mut marketplace.transferable_listings, id, listing);
}
```

Soulbound tokens:

```move
public fun list_soulbound<T: key>(
    marketplace: &mut Marketplace,
    asset: &T,
    price: u64
) {
    let id = object::id(asset);
    let listing = ListingMetadata {
        asset_id: id,
        price,
        asset_type: string::utf8(b"SoulboundToken"),
    };
    vec_map::insert(&mut marketplace.soulbound_listings, id, listing);
}
```

Building a marketplace in Move taught me the importance of aligning with the language's type system. Abilities like store are non-negotiable for dynamic fields, so planning for metadata storage early is critical for non-storable assets. Separate collections simplify handling different ability sets, while metadata tags enable runtime flexibility. Phantom types are great for compile-time distinctions, but runtime type handling requires explicit metadata.
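The separate-collections-plus-type-tag pattern described above is language-agnostic. Here is a minimal Python sketch of the same idea (names like Marketplace, list_transferable, and the "SoulboundToken" tag mirror the Move code but are otherwise hypothetical), showing how a runtime tag restores dispatch after type erasure:

```python
from dataclasses import dataclass, field

@dataclass
class Marketplace:
    # Separate collections per "ability set": transferable assets are held
    # directly; soulbound assets are represented only by id + metadata.
    transferable_listings: dict = field(default_factory=dict)  # id -> (asset, price)
    soulbound_listings: dict = field(default_factory=dict)     # id -> metadata

def list_transferable(m: Marketplace, asset_id: str, asset: object, price: int):
    # The asset itself moves into the marketplace's collection.
    m.transferable_listings[asset_id] = (asset, price)

def list_soulbound(m: Marketplace, asset_id: str, price: int):
    # The asset never moves; we store only metadata plus a type tag
    # that survives "type erasure" and drives runtime dispatch.
    m.soulbound_listings[asset_id] = {"price": price, "asset_type": "SoulboundToken"}

def describe(m: Marketplace, asset_id: str) -> str:
    if asset_id in m.transferable_listings:
        return "transferable"
    return m.soulbound_listings[asset_id]["asset_type"]

m = Marketplace()
list_transferable(m, "nft-1", object(), 100)
list_soulbound(m, "sbt-1", 0)
print(describe(m, "nft-1"))  # transferable
print(describe(m, "sbt-1"))  # SoulboundToken
```

The design point is that the two collections never need a common value type, which is exactly what Move's ability constraints force on you anyway.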
Unanswered
How to update a merchant's key in ObjectTable when it changes in the struct?
Hi everyone, I'm just getting started with writing smart contracts and I'm working on my very first project. I'd love some help with an issue I'm stuck on.

So far, I've created a Merchant struct that looks like this:

- id: a unique identifier (UID)
- owner: the address of the merchant
- key: a String used as a unique key
- balance: a u64 representing their balance

I also made a MerchantRegistry struct to manage all merchants:

- id: another UID
- merchant_to_address: an ObjectTable mapping addresses to merchants
- merchant_to_key: an ObjectTable mapping keys to merchants

I want to be able to look up a merchant either by their address or by their key. When a user updates their key inside the Merchant struct, the change doesn't automatically update the key in the merchant_to_key table. That means the old key still points to the merchant, which breaks things.

I tried removing the entry from the table and inserting it back with the new key, but I keep running into errors like: "Cannot ignore values without drop ability".

I'm pretty sure this is a beginner mistake, but I haven't been able to find a clear explanation or solution anywhere. Is there a proper way to handle updating the key in both the struct and the lookup table?
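The general re-keying pattern behind this question can be sketched in Python (a toy two-index registry with hypothetical names; in Move you would use ObjectTable remove/add, and the removed value must be consumed rather than discarded, which is what the "cannot ignore values without drop ability" error is about):

```python
class Registry:
    """Toy two-index registry: look up a merchant by address or by key."""
    def __init__(self):
        self.by_address = {}  # address -> merchant record
        self.by_key = {}      # key -> owning address (a reference, not a copy)

    def add(self, merchant: dict):
        self.by_address[merchant["owner"]] = merchant
        self.by_key[merchant["key"]] = merchant["owner"]

    def update_key(self, address: str, new_key: str):
        merchant = self.by_address[address]
        # Re-keying = remove the old entry AND reuse its value. pop() here
        # mirrors Move's rule that a value removed from a table without the
        # `drop` ability must be consumed (e.g., re-inserted), not ignored.
        owner = self.by_key.pop(merchant["key"])
        merchant["key"] = new_key
        self.by_key[new_key] = owner

r = Registry()
r.add({"owner": "0xA", "key": "shop-1", "balance": 10})
r.update_key("0xA", "shop-2")
print("shop-1" in r.by_key)        # False
print(r.by_key["shop-2"])          # 0xA
print(r.by_address["0xA"]["key"])  # shop-2
```

The key design choice is storing a reference (the address) rather than the merchant object itself in the second index, so only one table ever owns the record.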
- 0xduckmove (408) · For Sui · Jun 06, 2025
What's the easiest frontend to upload Walrus blobs?
Just looking for a simple UI to upload to Walrus (besides Tusky).
Trending
- 0xduckmove (408) · For Sui · Apr 08, 2025
👀 SEAL: I Think Web3 Data Privacy Is About to Change
👀 SEAL is live on the Sui Testnet, and I think Web3 data privacy is about to change.

In Web3, it's common to hear phrases like "users own their data" or "decentralized by design". But when you look closely, many applications still rely on centralized infrastructure to handle sensitive data, using services like AWS or Google Cloud for key management. This introduces a contradiction: decentralization on the surface, centralization underneath. But what if there was a way to manage secrets securely, without giving up decentralization? Enter SEAL: Decentralized Secrets Management (DSM), now live on the Sui Testnet. SEAL aims to fix one of Web3's biggest hypocrisies: shouting decentralization while secretly using AWS.

You may ask: what is SEAL? SEAL is a protocol that lets you manage sensitive data securely and decentrally, built specifically for the Web3 world. Think of it as a privacy-first access control layer that plugs into your dApp. You can think of SEAL as a kind of programmable lock for your data: you don't just lock and unlock things manually, you write policies directly into your smart contracts, using Move on Sui.

Let's say you're building a dApp where:

- only NFT holders can unlock a premium tutorial,
- a DAO has to vote before sensitive files are revealed, or
- metadata is time-locked and only accessible after a specific date.

SEAL makes all of that possible. The access control lives onchain, fully automated, with no need for an admin to manage it. Just logic, baked right into the blockchain.

Another interesting piece is how SEAL handles encryption. It uses threshold encryption, which means no single node can decrypt the data. It takes a group of servers working together, kind of like multi-sig, but for unlocking secrets.
This distributes trust and avoids the usual single-point-of-failure problem. And to keep things truly private, SEAL encrypts and decrypts everything on the client side. Your data is never visible to any backend; it stays in your hands, literally, on your device.

SEAL also doesn't care where you store your data. Whether it's IPFS, Arweave, Walrus, or some other platform, SEAL doesn't try to control that part. It just focuses on who's allowed to see what, not where things are stored. So it's not just a library or an API; it's an onchain-first, access-controlled, privacy-by-default layer for your dApp.

SEAL fills a pretty critical gap. Let's break that down a bit more. If you're building a dApp that deals with any form of sensitive data (gated content, user documents, encrypted messages, even time-locked NFT metadata), you'll run into the same problem: how do you manage access securely, without relying on a centralized service? Without something like SEAL, most teams either use centralized tools like AWS KMS or Firebase, which clearly goes against decentralization, or try to patch together half-baked encryption logic themselves, which usually ends up brittle and hard to audit.

https://x.com/EmanAbio/status/1908240279720841425

Neither of those approaches scales well, especially when you're trying to build trustless apps across multiple chains or communities. SEAL makes that entire process modular and programmable. You define your access rules in Move smart contracts, and SEAL handles the rest: key generation, decryption approvals, and access enforcement, all without anyone manually issuing keys or running backend checks.
Even better, those rules are auditable and immutable: once they're onchain, they follow the contract, not a human admin. So instead of asking "who should manage access to this data?" you just ask "what logic should define access?" and let the chain handle it. Clean and scalable.

That's what makes SEAL relevant for more than just security tools; it's a base layer for any dApp that cares about privacy, compliance, or dynamic access logic. It's a small shift, but it changes a lot about how we think of data in Web3. Instead of encrypting after deployment, or relying on external services, you start with privacy built in, and access handled entirely by smart contract logic. And that's exactly what Web3 needs right now.

How Does SEAL Actually Work?

We've covered what SEAL is and why Web3 needs it; now let's look at how it's actually built under the hood. This part is where things get more technical, but in a good way: the architecture is elegant once you see how all the pieces fit together. At a high level, SEAL works by combining onchain access logic with offchain key management, using a technique called Identity-Based Encryption (IBE). This allows devs to encrypt data to an identity, and then rely on smart contracts to define who is allowed to decrypt it.

Step 1: Access Rules in Smart Contracts (on Sui)

Everything starts with the smart contract. When you're using SEAL, you define a function called seal_approve in your Move contract; this is where you write your conditions for decryption. For example, here's a simple time-lock rule written in Move:

```move
entry fun seal_approve(id: vector<u8>, c: &clock::Clock) {
    let mut prepared: BCS = bcs::new(id);
    let t = prepared.peel_u64();
    let leftovers = prepared.into_remainder_bytes();
    assert!((leftovers.length() == 0) && (c.timestamp_ms() >= t), ENoAccess);
}
```

Once deployed, this contract acts as the gatekeeper. Whenever someone wants to decrypt data, their request gets checked against this logic.
If it passes, the key gets released. If not, they're blocked. No one has to intervene.

Step 2: Identity-Based Encryption (IBE)

Here's where the magic happens. Instead of encrypting data for a specific wallet address (as with PGP or RSA), SEAL uses identity strings, meaning you encrypt to something like:

- 0xwalletaddress
- dao_voted:proposal_xyz
- PkgId_2025_05_01 (a timestamp-based rule)
- or even game_user_nft_holder

When the data is encrypted, it looks like this:

Encrypt(mpk, identity, message)

- mpk = master public key (known to everyone)
- identity = the logic-defined recipient
- message = the actual data

Later, if someone wants to decrypt, the key server checks whether they match the policy (via the seal_approve call onchain). If approved, it returns a derived private key for that identity:

Derive(msk, identity) → sk
Decrypt(sk, encrypted_data)

The user can then decrypt the content locally. So encryption is done without needing to know who will decrypt ahead of time; you just define the conditions, and SEAL figures out the rest later. It's dynamic.

Step 3: The Key Server — Offchain, But Not Centralized

You might wonder: who's holding these master keys? This is where SEAL's key server comes in. Think of it as a backend that:

- holds the master secret key (msk),
- watches onchain contracts (like your seal_approve logic), and
- only issues derived keys if the conditions are satisfied.

And this is key: SEAL doesn't rely on just one key server. You can run it in threshold mode, where multiple independent servers need to agree before a decryption key is issued; for example, 3-of-5 key servers must approve the request. This avoids central points of failure and allows decentralization at the key management layer too. Even better, in the future SEAL will support MPC (multi-party computation) and enclave-based setups (like TEEs), so you can get even stronger guarantees without compromising usability.
Step 4: Client-Side Decryption

Once the key is returned to the user, the actual decryption happens on their device. This means:

- the server never sees your data,
- the backend never stores decrypted content, and
- only the user can access the final message.

It's a solid privacy model. Even if someone compromises the storage layer (IPFS, Arweave, etc.), they still can't read the data without passing the access logic. This structure makes it easy to build dApps where access rules aren't hardcoded; they're dynamic, auditable, and fully integrated into your chain logic.

The Team Behind SEAL

SEAL is led by samczsun, a well-known figure in the blockchain security community. Formerly a Research Partner at Paradigm, he has audited and saved multiple ecosystems from major exploits. Now he's focused full-time on building SEAL into a core piece of Web3's privacy infrastructure. With his background and credibility, SEAL is not just another experimental tool; it's a serious attempt at making decentralized data privacy both practical and scalable.

As SEAL goes live on the Sui Testnet, it brings a new standard for how Web3 applications can manage secrets. By combining onchain access control, threshold encryption, and client-side privacy, SEAL offers a more trustworthy foundation for decentralized data handling. Whether you're building dApps, DAOs, or decentralized games, SEAL provides a powerful toolkit to enforce access control and protect user data without compromising on decentralization. If Web3 is going to move forward, secure infrastructure like SEAL is not optional; it's essential.
AMM Bot in the Sui Ecosystem
What are the key features and functionalities of AMM bots within the Sui ecosystem? How do they improve upon traditional trading mechanisms, and what advantages do they offer to users engaging with DeFi protocols on the Sui network? Do I need to build one, or can I use Turbos Finance, for example?
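For background on what such a bot actually interacts with, here is the constant-product pricing math (x * y = k) that most AMMs use, as a hedged Python sketch; the 0.3% fee and the pool numbers are illustrative, not specific to any Sui protocol:

```python
def swap_out(reserve_in: float, reserve_out: float,
             amount_in: float, fee: float = 0.003) -> float:
    """Output amount for a swap against a constant-product pool.

    Keeps reserve_in * reserve_out invariant (k) after deducting the fee,
    so larger trades suffer more slippage.
    """
    amount_in_after_fee = amount_in * (1 - fee)
    k = reserve_in * reserve_out
    new_reserve_in = reserve_in + amount_in_after_fee
    return reserve_out - k / new_reserve_in

# Swapping 10 into a balanced 1000/1000 pool returns slightly less than 10
# (fee plus slippage) -- the spread an AMM bot tries to price against.
out = swap_out(1000.0, 1000.0, 10.0)
print(round(out, 4))
```

An arbitrage or market-making bot is essentially a loop that compares this implied price across pools (or against an off-chain reference) and submits swaps when the difference exceeds fees plus gas.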
- Foksa (225) · For Peera Meta · Jul 25, 2022
How to create an account
To create an account on the Peeranha website you need to log in first. Peeranha supports four wallets you can log in with: Torus, MetaMask, WalletConnect, and Coinbase.

Note: If you are new to Web3, we recommend you use Torus.

Log in with Torus:

1. Click Log in.
2. Select the checkbox "I agree to the Terms and Conditions and Privacy Policy".
3. Choose Torus in the pop-up window.
4. Select any of the suggested ways you want to sign in, or enter your email address.

You are now logged in and have an account. In your profile section, you can add more detailed information about yourself and track your progress and achievements. To edit your profile, click Profile, select Edit, and enter all the information that you want to be visible in your profile.