How to Build a Performant and Scalable Full Stack NFT Marketplace
By adding an indexing middleware
This article branches off from Nader Dabit's fantastic walkthrough on how to build a Full Stack NFT Marketplace.
I give Nader credit for the idea, and I consider this project a homage to his work.
Introduction
I was inspired to start this project after watching Nader's video on his YouTube channel. His project is a functional, proof-of-concept NFT marketplace, and a great example of a full stack DApp, including a smart contract as well as a frontend.
When I saw it, because of what I do on a daily basis, I instantly noticed something: it uses direct calls to the contract to gather data for the frontend. In fact, the smart contract even has some custom functions that were developed specifically for this purpose.
Talking to DApp devs over the past year, I know this is actually quite common, especially in the early stages of development. You need to build the frontend and you need a data source: what better data source than the contract itself? It's relatively simple, since the contract is already there, and it does not require additional components.
But this does not hold up in the long run. The performance of direct RPC calls is poor and it does not scale, so once the contract is deployed to mainnet (and in some cases even earlier, on testnet) you start to notice that an indexing middleware is necessary. Adding one requires rebuilding a decent chunk of the frontend, because instead of relying on smart contract calls, the dApp now performs requests to a web service. This means duplicating the efforts of frontend developers; it's wasted time, and as we all know, time is money.
But developers are "forced" to do this because there are no easy alternatives. Or are there? That's what I am going to show in this article.
Project and setup
I have forked Nader's repository in my own GitHub account, you can check it out here:
RaekwonIII/subsquid-ethereum-nextjs-marketplace
In the main branch you can find the end result, but if you want to follow along with the article, I advise you to clone the repository and check out the workshop-start branch, which has a couple of necessary changes. For example, at the top of the create-nft.js file, instead of the basic client:
// pages/create-nft.js
const client = ipfsHttpClient('https://ipfs.infura.io:5001/api/v0')
I had to add code to provide authentication when uploading images to IPFS, because the Infura node requires it.
// pages/create-nft.js
const auth =
'Basic ' + Buffer.from(process.env.NEXT_PUBLIC_IPFS_PROJECT_ID + ':' + process.env.NEXT_PUBLIC_IPFS_PROJECT_SECRET).toString('base64');
const client = ipfsHttpClient({
host: 'infura-ipfs.io',
port: 5001,
protocol: 'https',
headers: {
authorization: auth,
},
});
I have also added a docker-compose.yml file and a squid folder at the root of the project, but we are going to talk about this in the next chapter.
Small changes aside, this represents a snapshot of Nader's project at the end of his walkthrough. We can launch the project with a series of commands. First, run the Hardhat node:
npx hardhat node
Then deploy the contract (in a different terminal):
npx hardhat run scripts/deploy.js --network localhost
And finally run the app:
npm run dev
When we visit the home page, it loads the NFTs up for sale using a special smart contract function called fetchMarketItems():
// pages/index.js
async function loadNFTs() {
/* create a generic provider and query for unsold market items */
const provider = new ethers.providers.JsonRpcProvider()
const contract = new ethers.Contract(marketplaceAddress, NFTMarketplace.abi, provider)
const data = await contract.fetchMarketItems()
// ...
}
We can find it in the Solidity code. The original video even shows how to code it but, in short, it does what the comment says: it returns unsold items:
//contracts/NFTMarketplace.sol
/* Returns all unsold market items */
function fetchMarketItems() public view returns (MarketItem[] memory) {
uint itemCount = _tokenIds.current();
uint unsoldItemCount = _tokenIds.current() - _itemsSold.current();
uint currentIndex = 0;
MarketItem[] memory items = new MarketItem[](unsoldItemCount);
for (uint i = 0; i < itemCount; i++) {
if (idToMarketItem[i + 1].owner == address(this)) {
uint currentId = i + 1;
MarketItem storage currentItem = idToMarketItem[currentId];
items[currentIndex] = currentItem;
currentIndex += 1;
}
}
return items;
}
The same goes for the dashboard and my-nfts pages, where the same initialization is done. They use different functions, with different logic, but the concept remains the same.
Local development, local indexer
One of the main changes I made in this branch was adding a squid subfolder, which contains a template of a squid data indexer; by the end of this article, this is going to be my data source for the Next.js frontend.
This squid indexer relies on another component called Archive, and this is where the docker-compose.yml file in the root folder comes into play. You can read more about this in our docs, but in short, Archives ingest raw blockchain data, which you can then further transform and customize in your own squid, which is essentially an ETL pipeline.
This file defines two services, called ingester and worker, that ingest data from the node, normalize it and present it via a REST API. I have also created my own images and hosted them in my own registry, because the Ethereum Archive is being actively developed at the moment; as soon as it stabilizes, Subsquid will likely provide public images.
A quick note about indexing Hardhat: the local node's default configuration only "mines" new blocks when there is an interaction with the chain, like a smart contract function call. To make it easier to follow the Archive ingestion, as well as the squid indexing, I have changed the mining configuration in hardhat.config.js so that blocks are produced at a regular interval. I also took the chance to add the configuration to deploy on the Goerli network:
// hardhat.config.js
require("@nomiclabs/hardhat-waffle");
const fs = require('fs');
const infuraId = fs.readFileSync(".infuraid").toString().trim() || ""; // process.env.INFURA_NODE_ID;//
const account = fs.readFileSync(".account").toString().trim() || "";
module.exports = {
defaultNetwork: "hardhat",
networks: {
hardhat: {
chainId: 1337,
mining: {
auto: false,
interval: [4800, 5200]
}
},
goerli: {
// Infura
url: `https://goerli.infura.io/v3/${infuraId}`,
accounts: [account]
},
/*
matic: {
// Infura
// url: `https://polygon-mainnet.infura.io/v3/${infuraId}`,
url: "https://rpc-mainnet.maticvigil.com",
accounts: [process.env.privateKey]
}
*/
},
solidity: {
version: "0.8.4",
settings: {
optimizer: {
enabled: true,
runs: 200
}
}
}
};
I did not find a more elegant way to read INFURA_NODE_ID, or the private key of the throw-away account I'll be using for testnet, from the environment variables, so I resorted to writing them down into .infuraid and .account files, which are not part of the repository.
With that being said, let's get back to our terminal and launch the Archive services, running the command:
docker compose up -d
Now we can start the squid development.
Squid development
In order for us to use the squid indexer as our data source, we need to customize the template to our needs and launch it.
Schema
The first thing to do is to define the schema, which determines both the database and API structure.
In order for our frontend to access the data it needs, we have to keep track of all the Tokens created, and their metadata.
I also need to track the Owner of an NFT.
The Contract entity is something I usually have when I index NFTs from multiple collections. For the sake of this article, I am going to keep it, but I am going to hard-code most of its information for simplicity, as it's not very relevant to the purpose of this project.
Lastly, every time an NFT is sold, we are going to create a new Transfer object, which keeps track, for example, of the historical price of every transfer.
Here's how the squid/schema.graphql file should look:
type Token @entity {
id: ID!
owner: Owner
uri: String
transfers: [Transfer!]! @derivedFrom(field: "token")
contract: Contract
name: String,
description: String
imageURI: String
price: BigInt!
forSale: Boolean
}
type Owner @entity {
id: ID!
ownedTokens: [Token!] @derivedFrom(field: "owner")
}
type Contract @entity {
id: ID!
name: String! @index
symbol: String! @index
# contract URI updated once e.g. a day
contractURI: String
address: String
# timestamp when the contract URI was updated last
contractURIUpdated: BigInt @index
totalSupply: BigInt!
mintedTokens: [Token!]! @derivedFrom(field: "contract")
}
type Transfer @entity {
id: ID!
token: Token!
from: Owner
to: Owner
price: BigInt!
timestamp: BigInt! @index
block: Int! @index
transactionHash: String! @index
}
Now, before doing anything else, we need to install the indexer's dependencies. Let's make sure to navigate to the squid subfolder, and then run the command:
npm i
In the same terminal window, launch the command:
make codegen
This is a command from Subsquid's SDK, which generates TypeScript models from the schema we just defined. You can find them under src/model/generated. It has created one file for each of the entities in the schema, and each file defines a class with the same properties as in the schema. These classes are our interface with the database, and we are going to use them to save our data to the DB.
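To give an idea of how they are used, here is a minimal usage sketch (the same pattern appears in the processor later on; ctx.store is the database handle the processor hands to our code, and the owner address is just a placeholder):
// a minimal usage sketch of the generated models (not part of the generated code)
import { Owner, Token } from "./model";
// inside the processor.run() callback, where `ctx` is in scope:
const owner = new Owner({ id: "0xabc0000000000000000000000000000000000000" });
const token = new Token({ id: "1", owner, price: 0n });
await ctx.store.save(owner);
await ctx.store.save(token);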
Smart contract and ABI
Now we need to make sure we are able to interact with the smart contract and, to do this, we need the smart contract ABI. Luckily for us, it has already been generated by Hardhat, and we can find it under the artifacts folder. Let's head back to the terminal and run:
npx squid-evm-typegen src/abi ../artifacts/contracts/NFTMarketplace.sol/NFTMarketplace.json
This is another command from Subsquid's SDK, and it generates TypeScript facades for EVM transactions, logs and calls. It takes two arguments: the first is the destination folder, the second is the ABI's location.
It has created three files; the interesting one is src/abi/NFTMarketplace.ts. It defines:
- an events object, with LogEvent classes to decode smart contract events and access their topics
- a functions object, with Func classes that allow us to decode smart contract function calls and access their input
- a Contract class to directly call read-only (non-payable) smart contract functions from our indexer
We are going to use the tokenURI() function later on. Another use case for this class would be to fetch the contract's metadata (instead of hard-coding it, like I am going to do later on), by calling name(), symbol() and other similar functions.
// src/abi/NFTMarketplace.ts
import * as ethers from 'ethers'
import {LogEvent, Func, ContractBase} from './abi.support'
import {ABI_JSON} from './NFTMarketplace.abi'
export const abi = new ethers.utils.Interface(ABI_JSON);
export const events = {
Approval: new LogEvent<([owner: string, approved: string, tokenId: ethers.BigNumber] & {owner: string, approved: string, tokenId: ethers.BigNumber})>(
abi, '0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925'
),
ApprovalForAll: new LogEvent<([owner: string, operator: string, approved: boolean] & {owner: string, operator: string, approved: boolean})>(
abi, '0x17307eab39ab6107e8899845ad3d59bd9653f200f220920489ca2b5937696c31'
),
MarketItemCreated: new LogEvent<([tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean] & {tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean})>(
abi, '0xb640004f1d14576d0c209e240cad0410e0d8c0c33a09375861fbadae2588a98d'
),
Transfer: new LogEvent<([from: string, to: string, tokenId: ethers.BigNumber] & {from: string, to: string, tokenId: ethers.BigNumber})>(
abi, '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'
),
}
export const functions = {
approve: new Func<[to: string, tokenId: ethers.BigNumber], {to: string, tokenId: ethers.BigNumber}, []>(
abi, '0x095ea7b3'
),
balanceOf: new Func<[owner: string], {owner: string}, ethers.BigNumber>(
abi, '0x70a08231'
),
createMarketSale: new Func<[tokenId: ethers.BigNumber], {tokenId: ethers.BigNumber}, []>(
abi, '0xbe9af536'
),
createToken: new Func<[tokenURI: string, price: ethers.BigNumber], {tokenURI: string, price: ethers.BigNumber}, ethers.BigNumber>(
abi, '0x72b3b620'
),
fetchItemsListed: new Func<[], {}, Array<([tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean] & {tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean})>>(
abi, '0x45f8fa80'
),
fetchMarketItems: new Func<[], {}, Array<([tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean] & {tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean})>>(
abi, '0x0f08efe0'
),
fetchMyNFTs: new Func<[], {}, Array<([tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean] & {tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean})>>(
abi, '0x202e3740'
),
getApproved: new Func<[tokenId: ethers.BigNumber], {tokenId: ethers.BigNumber}, string>(
abi, '0x081812fc'
),
getListingPrice: new Func<[], {}, ethers.BigNumber>(
abi, '0x12e85585'
),
isApprovedForAll: new Func<[owner: string, operator: string], {owner: string, operator: string}, boolean>(
abi, '0xe985e9c5'
),
name: new Func<[], {}, string>(
abi, '0x06fdde03'
),
ownerOf: new Func<[tokenId: ethers.BigNumber], {tokenId: ethers.BigNumber}, string>(
abi, '0x6352211e'
),
resellToken: new Func<[tokenId: ethers.BigNumber, price: ethers.BigNumber], {tokenId: ethers.BigNumber, price: ethers.BigNumber}, []>(
abi, '0xe219fc75'
),
'safeTransferFrom(address,address,uint256)': new Func<[from: string, to: string, tokenId: ethers.BigNumber], {from: string, to: string, tokenId: ethers.BigNumber}, []>(
abi, '0x42842e0e'
),
'safeTransferFrom(address,address,uint256,bytes)': new Func<[from: string, to: string, tokenId: ethers.BigNumber, data: string], {from: string, to: string, tokenId: ethers.BigNumber, data: string}, []>(
abi, '0xb88d4fde'
),
setApprovalForAll: new Func<[operator: string, approved: boolean], {operator: string, approved: boolean}, []>(
abi, '0xa22cb465'
),
supportsInterface: new Func<[interfaceId: string], {interfaceId: string}, boolean>(
abi, '0x01ffc9a7'
),
symbol: new Func<[], {}, string>(
abi, '0x95d89b41'
),
tokenURI: new Func<[tokenId: ethers.BigNumber], {tokenId: ethers.BigNumber}, string>(
abi, '0xc87b56dd'
),
transferFrom: new Func<[from: string, to: string, tokenId: ethers.BigNumber], {from: string, to: string, tokenId: ethers.BigNumber}, []>(
abi, '0x23b872dd'
),
updateListingPrice: new Func<[_listingPrice: ethers.BigNumber], {_listingPrice: ethers.BigNumber}, []>(
abi, '0xae677aa3'
),
}
export class Contract extends ContractBase {
balanceOf(owner: string): Promise<ethers.BigNumber> {
return this.eth_call(functions.balanceOf, [owner])
}
fetchItemsListed(): Promise<Array<([tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean] & {tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean})>> {
return this.eth_call(functions.fetchItemsListed, [])
}
fetchMarketItems(): Promise<Array<([tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean] & {tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean})>> {
return this.eth_call(functions.fetchMarketItems, [])
}
fetchMyNFTs(): Promise<Array<([tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean] & {tokenId: ethers.BigNumber, seller: string, owner: string, price: ethers.BigNumber, sold: boolean})>> {
return this.eth_call(functions.fetchMyNFTs, [])
}
getApproved(tokenId: ethers.BigNumber): Promise<string> {
return this.eth_call(functions.getApproved, [tokenId])
}
getListingPrice(): Promise<ethers.BigNumber> {
return this.eth_call(functions.getListingPrice, [])
}
isApprovedForAll(owner: string, operator: string): Promise<boolean> {
return this.eth_call(functions.isApprovedForAll, [owner, operator])
}
name(): Promise<string> {
return this.eth_call(functions.name, [])
}
ownerOf(tokenId: ethers.BigNumber): Promise<string> {
return this.eth_call(functions.ownerOf, [tokenId])
}
supportsInterface(interfaceId: string): Promise<boolean> {
return this.eth_call(functions.supportsInterface, [interfaceId])
}
symbol(): Promise<string> {
return this.eth_call(functions.symbol, [])
}
tokenURI(tokenId: ethers.BigNumber): Promise<string> {
return this.eth_call(functions.tokenURI, [tokenId])
}
}
Implement mapping logic
Now that all of this is done, let's focus on the main logic, which lives in the processor.ts file. As the name suggests, this is where we define how to process blockchain data. The template has a minimalistic setup: it indexes transactions to the Null address and prints a log line every time it encounters a new transaction.
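If you open the file before making any changes, the template logic looks roughly like this (a simplified sketch, reconstructed from the same API we'll use below, not a verbatim copy of the template):
// squid/src/processor.ts (template, roughly)
import { TypeormDatabase } from "@subsquid/typeorm-store";
import { EvmBatchProcessor } from "@subsquid/evm-processor";

const processor = new EvmBatchProcessor()
  .setDataSource({
    chain: process.env.ETHEREUM_WSS,
    archive: process.env.ARCHIVE_URL || "https://goerli.archive.subsquid.io/",
  })
  // the template tracks transactions sent to the Null address
  .addTransaction("0x0000000000000000000000000000000000000000", {
    data: { transaction: { from: true, input: true, to: true } },
  });

processor.run(new TypeormDatabase(), async (ctx) => {
  for (const block of ctx.blocks) {
    for (const item of block.items) {
      if (item.kind === "transaction") {
        // print a log line for every transaction encountered
        ctx.log.info(`transaction found in block ${block.header.height}`);
      }
    }
  }
});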
Before starting to make changes here, I want to mention that we are going to use some environment variables. The list includes the address of the marketplace contract, which you can find, for example, in the logs of the hardhat node, or in the console output of the command used to deploy it. The rest of the variables are credentials for an Infura node and the Infura IPFS service, which allow us to pin the NFTs' images, as well as their metadata. The squid itself needs a couple more variables: the Archive's URL, which points to the containers we launched earlier, and the chain's RPC node, which in our case is the hardhat node.
Create a .env file in the project's root and add these variables, then get an Infura (or equivalent) node and IPFS service, and fill in the correct values:
# .env
CLIENT_URL="http://localhost:8545"
NEXT_PUBLIC_SQUID_URL="http://localhost:4350/graphql"
NEXT_PUBLIC_WORKSPACE_URL=$CLIENT_URL
NEXT_PUBLIC_INFURA_NODE_ID=""
NEXT_PUBLIC_MARKETPLACE_ADDRESS=""
NEXT_PUBLIC_PRIVATE_KEY=""
NEXT_PUBLIC_IPFS_PROJECT_ID=""
NEXT_PUBLIC_IPFS_PROJECT_SECRET=""
Also, create a squid/.env file and paste this content:
# squid/.env
DB_NAME=squid
DB_PORT=23798
GQL_PORT=4350
ETHEREUM_WSS="http://localhost:8545"
ARCHIVE_URL="http://localhost:8080"
Here is a summary of the changes we will need to make to processor.ts:
- change the data source to the ARCHIVE_URL environment variable, and add the chain parameter, assigning the ETHEREUM_WSS variable (the name is actually misleading, http(s) is also accepted)
- add an addLog function call to receive data for the MarketItemCreated event
- change the current addTransaction function call to request createMarketSale function data, and add a second addTransaction for the resellToken function as well
- add a MarketItemData type to hold event/function data while processing the batch, before saving to the database
- implement a getOrCreateContractEntity function to deal with the Contract model instance, which we'll treat as a singleton. This is where I am going to hard-code contract information, but you can go ahead and use the Contract class from our ABI facades file to fetch data directly from the on-chain contract (a possible sketch of this alternative is shown right after this list)
- implement three functions named handleMarketItemCreatedEvent, handleCreateMarketSaleFunction, handleResellTokenFunction, to process event/function data and return a MarketItemData object
- implement a saveItems function that accepts an array of MarketItemData (generated by the previous three functions while processing the batch) and saves the data models to the database in bulk, with some performance optimizations
- change the code inside the for loops in the callback function of processor.run() to verify what kind of item the current iteration has encountered, call the right handleX function, and add the resulting MarketItemData to an array created at the top
- in the same callback function of processor.run(), at the very end, outside the for loops, call the saveItems function
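For reference, here is a possible sketch of that chain-based alternative (an assumption on my part, not what I ended up shipping): it reuses the generated Contract facade, imported as ContractAPI in the final code below, to read name() and symbol() from the chain instead of hard-coding them.
// a hypothetical chain-based variant of getOrCreateContractEntity
// (relies on the same imports as the final processor.ts shown below)
async function getOrCreateContractEntityFromChain(
  ctx: BlockHandlerContext<Store>
): Promise<Contract> {
  let entity = await ctx.store.get(Contract, contractAddress);
  if (entity == null) {
    // read-only calls through the generated facade, at the current block height
    const contractAPI = new ContractAPI(ctx, { height: ctx.block.height }, contractAddress);
    entity = new Contract({
      id: contractAddress,
      name: await contractAPI.name(),
      symbol: await contractAPI.symbol(),
      totalSupply: 0n,
    });
    await ctx.store.insert(entity);
  }
  return entity;
}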
And here's the final result:
// squid/src/processor.ts
import { Store, TypeormDatabase } from "@subsquid/typeorm-store";
import {
EvmBatchProcessor,
BlockHandlerContext,
LogHandlerContext,
TransactionHandlerContext,
} from "@subsquid/evm-processor";
import {
events,
functions,
Contract as ContractAPI,
} from "./abi/NFTMarketplace";
import { Contract, Owner, Token, Transfer } from "./model";
import { In } from "typeorm";
import { BigNumber } from "ethers";
import axios from "axios";
const contractAddress =
process.env.NEXT_PUBLIC_MARKETPLACE_ADDRESS?.toLowerCase() ||
"0x0000000000000000000000000000000000000000";
const processor = new EvmBatchProcessor()
.setDataSource({
chain: process.env.ETHEREUM_WSS,
archive: process.env.ARCHIVE_URL || "https://goerli.archive.subsquid.io/",
})
.addLog(contractAddress, {
filter: [[events.MarketItemCreated.topic]],
data: {
evmLog: {
topics: true,
data: true,
},
transaction: {
hash: true,
},
},
})
.addTransaction(contractAddress, {
sighash: functions.createMarketSale.sighash,
data: {
transaction: {
from: true,
input: true,
to: true,
},
},
})
.addTransaction(contractAddress, {
sighash: functions.resellToken.sighash,
data: {
transaction: {
from: true,
input: true,
to: true,
},
},
});
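// Batch handler: walk every block and item in the batch, dispatch logs and
// transactions to the matching handler, collect the decoded MarketItemData
// and persist everything at the end with a single saveItems() call.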
processor.run(new TypeormDatabase(), async (ctx) => {
const marketItemData: MarketItemData[] = [];
for (const block of ctx.blocks) {
for (const item of block.items) {
if (item.kind === "evmLog" && item.address === contractAddress) {
if (item.evmLog.topics[0] === events.MarketItemCreated.topic) {
ctx.log.info("found MarketItemCreated Event!");
const marketItemDatum = handleMarketItemCreatedEvent({
...ctx,
block: block.header,
...item,
});
marketItemData.push(marketItemDatum);
}
}
if (item.kind === "transaction" && item.address === contractAddress) {
if (
item.transaction.input.slice(0, 10) ===
functions.createMarketSale.sighash
) {
ctx.log.info("found createMarketSale transaction!");
const marketItemDatum = handleCreateMarketSaleFunction({
...ctx,
block: block.header,
...item,
});
marketItemData.push(marketItemDatum);
}
if (
item.transaction.input.slice(0, 10) === functions.resellToken.sighash
) {
ctx.log.info("found resellToken transaction!");
const marketItemDatum = handleResellTokenFunction({
...ctx,
block: block.header,
...item,
});
marketItemData.push(marketItemDatum);
}
}
}
}
await saveItems(
{
...ctx,
block: ctx.blocks[ctx.blocks.length - 1].header,
},
marketItemData
);
});
type MarketItemData = {
id: string;
from?: string;
to?: string;
tokenId: bigint;
price?: bigint;
forSale: boolean;
timestamp: bigint;
block: number;
transactionHash: string;
};
let contractEntity: Contract | undefined;
export async function getOrCreateContractEntity(
store: Store
): Promise<Contract> {
if (contractEntity == null) {
contractEntity = await store.get(Contract, contractAddress);
if (contractEntity == null) {
contractEntity = new Contract({
id: process.env.NEXT_PUBLIC_MARKETPLACE_ADDRESS,
name: "MassimoTest",
symbol: "XYZ",
totalSupply: 0n,
});
await store.insert(contractEntity);
}
}
return contractEntity;
}
function handleCreateMarketSaleFunction(
ctx: TransactionHandlerContext<
Store,
{
transaction: {
from: true;
input: true;
to: true;
};
}
>
): MarketItemData {
const { transaction, block } = ctx;
const { tokenId } = functions.createMarketSale.decode(transaction.input);
const transactionHash = transaction.input;
const addr = transaction.to;
const marketItem: MarketItemData = {
id: `${transactionHash}-${addr}-${tokenId.toBigInt()}-${transaction.index}`,
tokenId: tokenId.toBigInt(),
to: transaction.from || "",
forSale: false,
timestamp: BigInt(block.timestamp),
block: block.height,
transactionHash: transaction.input,
};
return marketItem;
}
function handleResellTokenFunction(
ctx: TransactionHandlerContext<
Store,
{
transaction: {
from: true;
input: true;
to: true;
};
}
>
): MarketItemData {
const { transaction, block } = ctx;
const { tokenId, price } = functions.resellToken.decode(transaction.input);
const transactionHash = transaction.input;
const addr = transaction.to?.toLowerCase();
const marketItem: MarketItemData = {
id: `${transactionHash}-${addr}-${tokenId.toBigInt()}-${transaction.index}`,
tokenId: tokenId.toBigInt(),
price: price.toBigInt(),
forSale: true,
timestamp: BigInt(block.timestamp),
block: block.height,
transactionHash: transaction.input,
};
return marketItem;
}
function handleMarketItemCreatedEvent(
ctx: LogHandlerContext<
Store,
{ evmLog: { topics: true; data: true }; transaction: { hash: true } }
>
): MarketItemData {
const { evmLog, transaction, block } = ctx;
// the owner and seller fields are mislabelled
// when a MarketItem is created, the "seller" is the owner
// the NFT is just temporarily transferred to the contract itself
// I decide to use the `sold` field to distinguish between
// "my nfts" and "my nfts up for sale", reversing the logic
const { tokenId, seller, owner, price, sold } =
events.MarketItemCreated.decode(evmLog);
const marketItem: MarketItemData = {
id: "", // `${transaction.hash}-${addr}-${tokenId.toBigInt()}-${evmLog.index}`,
tokenId: tokenId.toBigInt(),
to: seller,
price: price.toBigInt(),
forSale: !sold, // when the item is created, `sold` is set to false, so it's up for sale
timestamp: BigInt(block.timestamp),
block: block.height,
transactionHash: transaction.hash,
};
return marketItem;
}
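// Bulk-save helper: pre-loads the Tokens and Owners touched in this batch,
// enriches Tokens with on-chain tokenURI and IPFS metadata, then writes
// Owners, Tokens and Transfers back to the database in batched calls.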
async function saveItems(
ctx: BlockHandlerContext<Store>,
transfersData: MarketItemData[]
) {
const tokensIds: Set<string> = new Set();
const ownersIds: Set<string> = new Set();
for (const transferData of transfersData) {
tokensIds.add(transferData.tokenId.toString());
if (transferData.from) ownersIds.add(transferData.from.toLowerCase());
if (transferData.to) ownersIds.add(transferData.to.toLowerCase());
}
const transfers: Set<Transfer> = new Set();
const tokens: Map<string, Token> = new Map(
(await ctx.store.findBy(Token, { id: In([...tokensIds]) })).map((token) => [
token.id,
token,
])
);
const owners: Map<string, Owner> = new Map(
(await ctx.store.findBy(Owner, { id: In([...ownersIds]) })).map((owner) => [
owner.id,
owner,
])
);
for (const transferData of transfersData) {
const {
id,
tokenId,
from,
to,
block,
transactionHash,
price,
forSale,
timestamp,
} = transferData;
const contract = new ContractAPI(ctx, { height: block }, contractAddress);
// the "" case handles absence of sender, which means it's not an actual transaction
// likely, it's the token being minted and put up for sale
let fromOwner = owners.get(from || "");
if (from && fromOwner == null) {
fromOwner = new Owner({ id: from.toLowerCase() });
owners.set(fromOwner.id, fromOwner);
}
// the "" case handles absence of receiver, which means it's not an actual transaction
// likely it's the token being put up for re-sale
let toOwner = owners.get(to || "");
if (to && toOwner == null) {
toOwner = new Owner({ id: to.toLowerCase() });
owners.set(toOwner.id, toOwner);
}
const tokenIdString = tokenId.toString();
let token = tokens.get(tokenIdString);
let tokenURI,
name,
description,
imageURI = "";
try {
tokenURI = await contract.tokenURI(BigNumber.from(tokenId));
} catch (error) {
ctx.log.warn(`[API] Error during fetch tokenURI of ${tokenIdString}`);
if (error instanceof Error) ctx.log.warn(`${error.message}`);
}
if (tokenURI !== "") {
// fetch metadata and assign name, description, imageURI
const meta = await axios.get(tokenURI || "1");
imageURI = meta.data.image;
name = meta.data.name;
description = meta.data.description;
}
if (token == null) {
token = new Token({
id: tokenIdString,
uri: tokenURI,
name,
description,
imageURI,
price,
contract: await getOrCreateContractEntity(ctx.store),
});
tokens.set(token.id, token);
}
token.owner = toOwner;
token.forSale = forSale;
token.price = price || token.price; // change the price ONLY if changed by function/event
if (toOwner && fromOwner) {
const transfer = new Transfer({
id,
block,
timestamp,
transactionHash,
from: fromOwner,
to: toOwner,
price: token.price,
token,
});
transfers.add(transfer);
}
}
await ctx.store.save([...owners.values()]);
await ctx.store.save([...tokens.values()]);
await ctx.store.save([...transfers]);
}
Once we have implemented our code to decode all the data, process it the way we wanted, and save it, we are ready to test it out. To do so, let's start the database container. In a terminal, from the squid folder, launch the command:
make up
Then we need to build our TypeScript code:
make build
Then, one thing I didn't mention before: under the db/migrations folder there is a JavaScript file, which is used to initialize the database schema for us. We need to delete the old one and generate a new one, by running the command:
make migration
Let's launch the processor, by running the command:
make process
We should be seeing logs telling us we came across some events or functions. There's one more thing we can do: launch the GraphQL server. In a different terminal window (since the make process command is blocking), run:
make serve
The log tells us which port the server is listening on: 4350. This means we can open our browser at http://localhost:4350/graphql. Here, we can use the GraphQL playground to test some queries and find out, for example, that this one:
query MyQuery {
tokens(orderBy: id_ASC, where: {AND: [
{owner: { id_eq: "INSERT_ADDRESS_HERE" }},
{forSale_eq: true}
]}) {
description
forSale
id
imageURI
name
price
uri
owner {
id
}
}
}
can be used to fetch the NFTs listed by a certain user, and it's going to be useful in the next section.
Changing the App to use our API
Once the API is up and running, we can go back to our IDE and start changing the App pages to use it, instead of querying the contract. The create-nft.js file does not need our intervention, because the contract interaction is necessary for the minting.
Dashboard page
But we can start with dashboard.js. Here we have this section, in the loadNFTs function, which we are going to replace.
// pages/dashboard.js
// ...
async function loadNFTs() {
// ...
const contract = new ethers.Contract(marketplaceAddress, NFTMarketplace.abi, signer)
const data = await contract.fetchItemsListed()
const items = await Promise.all(data.map(async i => {
const tokenUri = await contract.tokenURI(i.tokenId)
const meta = await axios.get(tokenUri)
let price = ethers.utils.formatUnits(i.price.toString(), 'ether')
let item = {
price,
tokenId: i.tokenId.toNumber(),
seller: i.seller,
owner: i.owner,
image: meta.data.image,
}
return item
}))
setNfts(items)
setLoadingState('loaded')
}
// ...
In order to use the squid API, we need to:
- get the address of the person logged in with the Metamask wallet
- set the headers of an API request
- create a requestBody object, containing the GraphQL query we developed at the end of the last chapter, with a caveat: it will accept a variable, containing the address of the connected Metamask wallet, as the owner of the NFTs
- create an options object for the API request options
- use the axios library to perform the API request, using the options we created
- process the response into an items array, similarly to the code snippet above
- call the setNfts function, providing the items array we just created
Here's the end result:
// pages/dashboard.js
import { ethers } from 'ethers'
import { useEffect, useState } from 'react'
import axios from 'axios'
import Web3Modal from 'web3modal'
export default function CreatorDashboard() {
const [nfts, setNfts] = useState([])
const [loadingState, setLoadingState] = useState('not-loaded')
useEffect(() => {
loadNFTs()
}, [])
async function loadNFTs() {
const web3Modal = new Web3Modal({
network: 'mainnet',
cacheProvider: true,
})
const connection = await web3Modal.connect()
const provider = new ethers.providers.Web3Provider(connection)
const signer = provider.getSigner()
const owner = (await signer.getAddress()).toLowerCase();
const headers = {
'content-type': 'application/json',
};
const requestBody = {
query: `query MyQuery ($owner: String!){
tokens(orderBy: id_ASC, where: {AND: [
{owner: { id_eq: $owner }},
{forSale_eq: true}
]}) {
description
forSale
id
imageURI
name
price
uri
owner {
id
}
}
}`,
variables: { owner }
};
const options = {
method: 'POST',
url: process.env.NEXT_PUBLIC_SQUID_URL,
headers,
data: requestBody
};
try {
const response = await axios(options);
const squiditems = await Promise.all(response.data.data.tokens.map(async i => {
let item = {
price: ethers.utils.formatUnits(i.price, 'ether'),
tokenId: Number(i.id),
owner: i.owner.id,
image: i.imageURI,
tokenURI: i.uri,
name: i.name,
description: i.description,
}
return item
}))
setNfts(squiditems)
setLoadingState('loaded')
} catch (err) {
console.log('ERROR DURING AXIOS REQUEST', err);
}
}
if (loadingState === 'loaded' && !nfts.length) return (<h1 className="py-10 px-20 text-3xl">No NFTs listed</h1>)
return (
<div>
<div className="p-4">
<h2 className="text-2xl py-2">Items Listed</h2>
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-4 gap-4 pt-4">
{
nfts.map((nft, i) => (
<div key={i} className="border shadow rounded-xl overflow-hidden">
<img src={nft.image} className="rounded" />
<div className="p-4 bg-black">
<p className="text-2xl font-bold text-white">Price - {nft.price} Eth</p>
</div>
</div>
))
}
</div>
</div>
</div>
)
}
Home page
We can now apply the same changes to the other pages, starting with the index.js file, which manages the home page.
On the home page, we see all NFTs up for sale, regardless of the owner. This means the query we used before can be recycled; we just need to get rid of the second condition, and of every mention of the variable, because it's no longer used.
// pages/index.js
import { ethers } from 'ethers'
import { useEffect, useState } from 'react'
import axios from 'axios'
import Web3Modal from 'web3modal'
import {
marketplaceAddress
} from '../config'
import NFTMarketplace from '../artifacts/contracts/NFTMarketplace.sol/NFTMarketplace.json'
export default function Home() {
const [nfts, setNfts] = useState([])
const [loadingState, setLoadingState] = useState('not-loaded')
useEffect(() => {
loadNFTs()
}, [])
async function loadNFTs() {
const headers = {
'content-type': 'application/json',
}
const requestBody = {
query: `query AllNFTsForSale {
tokens(where: {forSale_eq: true}, orderBy: id_ASC) {
description
forSale
id
imageURI
name
price
uri
owner {
id
}
}
}
`,
}
const options = {
method: 'POST',
url: process.env.NEXT_PUBLIC_SQUID_URL,
headers,
data: requestBody
}
try {
const response = await axios(options);
const squiditems = await Promise.all(response.data.data.tokens.map(async i => {
let item = {
price: ethers.utils.formatUnits(i.price, 'ether'),
tokenId: Number(i.id),
owner: i.owner.id,
image: i.imageURI,
tokenURI: i.uri,
name: i.name,
description: i.description,
}
return item
}))
setNfts(squiditems)
setLoadingState('loaded')
} catch (err) {
console.log('ERROR DURING AXIOS REQUEST', err);
}
}
async function buyNft(nft) {
/* needs the user to sign the transaction, so will use Web3Provider and sign it */
const web3Modal = new Web3Modal()
const connection = await web3Modal.connect()
const provider = new ethers.providers.Web3Provider(connection)
const signer = provider.getSigner()
const contract = new ethers.Contract(marketplaceAddress, NFTMarketplace.abi, signer)
/* user will be prompted to pay the asking proces to complete the transaction */
const price = ethers.utils.parseUnits(nft.price.toString(), 'ether')
const transaction = await contract.createMarketSale(nft.tokenId, {
value: price
})
await transaction.wait()
loadNFTs()
}
if (loadingState === 'loaded' && !nfts.length) return (<h1 className="px-20 py-10 text-3xl">No items in marketplace</h1>)
return (
<div className="flex justify-center">
<div className="px-4" style={{ maxWidth: '1600px' }}>
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-4 gap-4 pt-4">
{
nfts.map((nft, i) => (
<div key={i} className="border shadow rounded-xl overflow-hidden">
<img src={nft.image} />
<div className="p-4">
<p style={{ height: '64px' }} className="text-2xl font-semibold">{nft.name}</p>
<div style={{ height: '70px', overflow: 'hidden' }}>
<p className="text-gray-400">{nft.description}</p>
</div>
</div>
<div className="p-4 bg-black">
<p className="text-2xl font-bold text-white">{nft.price} ETH</p>
<button className="mt-4 w-full bg-pink-500 text-white font-bold py-2 px-12 rounded" onClick={() => buyNft(nft)}>Buy</button>
</div>
</div>
))
}
</div>
</div>
</div>
)
}
My NFTs page
Here we load all the NFTs owned by the Metamask wallet address which are not for sale. So once again, the initial query can be recycled; this time we need to switch the forSale logic to false.
// pages/my-nfts.js
import { ethers } from 'ethers'
import { useEffect, useState } from 'react'
import axios from 'axios'
import Web3Modal from 'web3modal'
import { useRouter } from 'next/router'
const auth =
'Basic ' + Buffer.from(process.env.NEXT_PUBLIC_IPFS_PROJECT_ID + ':' + process.env.NEXT_PUBLIC_IPFS_PROJECT_SECRET).toString('base64');
export default function MyAssets() {
const [nfts, setNfts] = useState([])
const [loadingState, setLoadingState] = useState('not-loaded')
const router = useRouter()
useEffect(() => {
loadNFTs()
}, [])
async function loadNFTs() {
const web3Modal = new Web3Modal({
network: "mainnet",
cacheProvider: true,
})
const connection = await web3Modal.connect()
const provider = new ethers.providers.Web3Provider(connection);
const signer = provider.getSigner();
const owner = (await signer.getAddress()).toLowerCase();
const headers = {
'content-type': 'application/json',
};
const requestBody = {
query: `query MyQuery ($owner: String!){
tokens(orderBy: id_ASC, where: {AND: [
{owner: { id_eq: $owner }},
{forSale_eq: false}
]}) {
description
forSale
id
imageURI
name
price
uri
owner {
id
}
}
}`,
variables: { owner }
};
const options = {
method: 'POST',
url: process.env.NEXT_PUBLIC_SQUID_URL,
headers,
data: requestBody
};
try {
const response = await axios(options);
const squiditems = await Promise.all(response.data.data.tokens.map(async i => {
let item = {
price: ethers.utils.formatUnits(i.price, 'ether'),
tokenId: Number(i.id),
owner: i.owner.id,
image: i.imageURI,
tokenURI: i.uri,
name: i.name,
description: i.description,
}
return item
}))
setNfts(squiditems)
setLoadingState('loaded')
}
catch (err) {
console.log('ERROR DURING AXIOS REQUEST', err);
}
}
function listNFT(nft) {
console.log('nft:', nft)
router.push(`/resell-nft?id=${nft.tokenId}&tokenURI=${nft.tokenURI}`)
}
if (loadingState === 'loaded' && !nfts.length) return (<h1 className="py-10 px-20 text-3xl">No NFTs owned</h1>)
return (
<div className="flex justify-center">
<div className="p-4">
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-4 gap-4 pt-4">
{
nfts.map((nft, i) => (
<div key={i} className="border shadow rounded-xl overflow-hidden">
<img src={nft.image} className="rounded" />
<div className="p-4 bg-black">
<p className="text-2xl font-bold text-white">Price - {nft.price} Eth</p>
<button className="mt-4 w-full bg-pink-500 text-white font-bold py-2 px-12 rounded" onClick={() => listNFT(nft)}>List</button>
</div>
</div>
))
}
</div>
</div>
</div>
)
}
Finally, the resell-nft.js page, similarly to create-nft.js, does not need any changes.
Tests and Goerli deployment
Next.js has auto reload, so every change in our code is automatically picked up. If we reload the page and open the inspector, in the Network tab we can see that there's a graphql request: that's where the data shown on the page comes from.
We managed to develop an NFT API by indexing blockchain data, and we integrated it into our project. All of this was done in our local development environment, which means that when this application goes live, the changes are going to be minimal.
As a matter of fact, it is possible to deploy this project on Goerli, the Ethereum testnet, and index it right away.
First of all, if you don't have one already, set up your Metamask with a real ETH address (I advise you to use a throw-away account for these tests), then head over to a faucet and get some test ETH, if you don't have enough already.
And finally, we need to deploy our contract to the testnet. From the project's root folder, in a terminal, run the command:
npx hardhat run scripts/deploy.js --network goerli
The good news is, to re-configure our App and index our contract, we just need to change a couple of environment variables:
- the contract address on Goerli (note it down after launching the deploy script)
- the Ethereum RPC endpoint
- the Archive URL, because we are going to use one of the many official Archives from Subsquid
# .env
# ...
NEXT_PUBLIC_MARKETPLACE_ADDRESS="NEW_ADDRESS"
# ...
# squid/.env
# ...
ETHEREUM_WSS="https://goerli.infura.io/v3/YOUR_INFURA_ID"
ARCHIVE_URL="https://goerli.archive.subsquid.io/"
We can stop our squid, and the Next.js app, then launch them again with the new configuration. Bear in mind, to properly test out the project, you'll need to switch accounts in Metamask to a real ETH address.
Conclusions
With just a couple of changes to environment variables, we are testing our application with the contract deployed on testnet, without having to rewrite our DApp. I chose to deploy the project on Goerli, but Subsquid has just announced Polygon compatibility, so you could also deploy it on the Mumbai network.
This is, in my opinion, a very good way to future-proof your application, making sure that the user experience is not degraded by the RPC bottleneck.
Subsquid is continuously developing, and will keep adding compatibility with new networks and deploying Archives for them. So head over to the website and the docs to learn more, and follow Subsquid on social media to stay up to date.
If you'd like to see more of these demo projects, follow me on Twitter, and feel free to reach out with requests or suggestions in the comments.