
NFT Creator Hub: Generation and Management

1.1 NFT Creator Hub Overview

NFT Creator Hub provides a comprehensive suite of tools for artists and content creators to generate, manage, and deploy NFTs. This documentation focuses on automating art generation, creating metadata, and structuring NFTs for blockchain compatibility, so creators can produce and manage collections efficiently.


1.2 Art Generation and Metadata Setup

Generating Art Layers

NFT art generation typically uses layers (background, body, accessories, etc.). Each element in a layer can be uniquely combined to produce various NFTs. Before generating NFTs, creators need to design and categorize these assets into folders, each representing a specific layer.

Folder Structure

├── layers
│   ├── background
│   ├── body
│   ├── accessories
│   └── expressions
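The size of a collection is bounded by the product of the element counts in each layer. A quick sketch of that arithmetic (the per-layer counts here are hypothetical):

```python
from math import prod

# Hypothetical number of elements in each layer folder
layer_counts = {"background": 5, "body": 4, "accessories": 6, "expressions": 3}

# Total unique combinations is the product of the counts
total_combinations = prod(layer_counts.values())
print(total_combinations)  # 5 * 4 * 6 * 3 = 360
```

In practice, rarity weighting and duplicate filtering reduce the usable count below this theoretical maximum.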

Combining Layers for Art Generation

A script (usually in Python) can automate the process of combining these layers to produce a complete set of unique images. The Pillow (PIL) library in Python is effective for layering and outputting image files.
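The core operation is Pillow's Image.alpha_composite, which stacks two same-size RGBA images. A minimal sketch using in-memory placeholder images (real layer files would be loaded from disk instead):

```python
from PIL import Image

# Two same-size RGBA layers created in memory (placeholders for real layer files)
background = Image.new("RGBA", (64, 64), (0, 0, 255, 255))  # opaque blue
overlay = Image.new("RGBA", (64, 64), (255, 0, 0, 128))     # semi-transparent red

# alpha_composite requires both images to be RGBA and the same size
combined = Image.alpha_composite(background, overlay)
print(combined.mode, combined.size)  # RGBA (64, 64)
```

Note that images opened from PNG files are not always in RGBA mode, so each layer should be passed through .convert("RGBA") before compositing.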

Metadata Creation and Structuring

Metadata is essential for defining each NFT's properties and follows the ERC-721 metadata JSON schema defined in EIP-721. Each NFT needs a JSON metadata file that describes its attributes (e.g., background, rarity) and points to the asset image.

Metadata Structure

{
  "name": "NFT #1",
  "description": "An exclusive collectible from the NFT Creator Hub",
  "image": "ipfs://<image_hash>/1.png",
  "attributes": [
    { "trait_type": "Background", "value": "Blue" },
    { "trait_type": "Accessory", "value": "Hat" },
    { "trait_type": "Expression", "value": "Happy" }
  ]
}
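A small helper can sanity-check each metadata file before upload. The required-key list below reflects the common ERC-721 metadata convention shown above, not an exhaustive schema:

```python
import json

# Keys marketplaces commonly expect in ERC-721 metadata
REQUIRED_KEYS = {"name", "description", "image"}

def validate_metadata(raw_json: str) -> list:
    """Return a list of problems found in one metadata JSON document."""
    problems = []
    data = json.loads(raw_json)
    for key in sorted(REQUIRED_KEYS - data.keys()):
        problems.append(f"missing key: {key}")
    for attr in data.get("attributes", []):
        if "trait_type" not in attr or "value" not in attr:
            problems.append(f"malformed attribute: {attr}")
    return problems

sample = json.dumps({
    "name": "NFT #1",
    "description": "demo",
    "image": "ipfs://<image_hash>/1.png",
    "attributes": [{"trait_type": "Background", "value": "Blue"}],
})
print(validate_metadata(sample))  # []
```

Running this over the metadata folder before pinning catches malformed files early, when they are still cheap to fix.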

1.3 Code for Art and Metadata Generation

Python Script for Automatic Art Generation

Below is a Python example script that combines individual layer images into complete artwork and saves them in a specified output folder.

from PIL import Image
import os
import json

# Define paths for each layer and output folder
layers = {
    "background": "./layers/background",
    "body": "./layers/body",
    "accessories": "./layers/accessories",
    "expressions": "./layers/expressions"
}
output_folder = "./output"
metadata_folder = "./metadata"

# Load layer images from each folder
def load_images(layer_path):
    # Sort for deterministic ordering; convert to RGBA so alpha_composite works
    return [
        Image.open(os.path.join(layer_path, f)).convert("RGBA")
        for f in sorted(os.listdir(layer_path))
    ]

def generate_image(count, backgrounds, bodies, accessories, expressions):
    # Select one element per layer, cycling deterministically by index
    background = backgrounds[count % len(backgrounds)]
    body = bodies[count % len(bodies)]
    accessory = accessories[count % len(accessories)]
    expression = expressions[count % len(expressions)]
    
    # Combine layers
    combined_image = Image.alpha_composite(background, body)
    combined_image = Image.alpha_composite(combined_image, accessory)
    combined_image = Image.alpha_composite(combined_image, expression)
    
    # Save generated image
    combined_image.save(f"{output_folder}/{count}.png")

def generate_metadata(count, traits):
    # Generate metadata JSON structure
    metadata = {
        "name": f"NFT #{count}",
        "description": "An exclusive collectible from the NFT Creator Hub",
        "image": f"ipfs://<image_hash>/{count}.png",
        "attributes": traits
    }
    
    # Save metadata as JSON
    with open(f"{metadata_folder}/{count}.json", "w") as f:
        json.dump(metadata, f, indent=4)

def main():
    backgrounds = load_images(layers["background"])
    bodies = load_images(layers["body"])
    accessories = load_images(layers["accessories"])
    expressions = load_images(layers["expressions"])
    
    # Generate images and metadata for each unique NFT
    for i in range(10):  # Example: generating 10 NFTs
        generate_image(i, backgrounds, bodies, accessories, expressions)
        
        # Placeholder traits for illustration; in practice, derive these
        # from the layer elements actually selected for image i
        traits = [
            {"trait_type": "Background", "value": "Blue"},
            {"trait_type": "Accessory", "value": "Hat"},
            {"trait_type": "Expression", "value": "Happy"}
        ]
        generate_metadata(i, traits)

if __name__ == "__main__":
    os.makedirs(output_folder, exist_ok=True)
    os.makedirs(metadata_folder, exist_ok=True)
    main()

This script:

  1. Loads layer images from specified folders.

  2. Combines layers to create unique images.

  3. Generates metadata files in JSON format for each NFT.

Dependencies

Pillow library for image handling. Install using:

pip install pillow

1.4 Configuration and Deployment of NFTs

After creating the artwork and metadata, the next steps involve storing the assets on a decentralized file storage system (like IPFS) and deploying the NFT smart contracts on a blockchain network.

Steps for Deployment

  1. Upload Images to IPFS: Use a tool like Pinata or Infura to upload images to IPFS and obtain IPFS hashes.

  2. Link Metadata to IPFS Hashes: Update the "image" field in each metadata file to include the IPFS link.

  3. Deploy NFT Contract: Use a contract such as ERC-721 on Ethereum or other blockchains like Polygon or Binance Smart Chain for deployment.
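Step 2 can be automated once the image folder's CID is known. This sketch rewrites the "image" field in every metadata file; the CID argument is a placeholder you would substitute with the hash returned by your pinning service:

```python
import json
import os

def relink_metadata(metadata_folder: str, image_cid: str) -> None:
    """Point each metadata file's "image" field at the pinned IPFS folder."""
    for filename in os.listdir(metadata_folder):
        if not filename.endswith(".json"):
            continue
        path = os.path.join(metadata_folder, filename)
        with open(path) as f:
            metadata = json.load(f)
        # Metadata files are named "<token_id>.json", matching "<token_id>.png"
        token_id = os.path.splitext(filename)[0]
        metadata["image"] = f"ipfs://{image_cid}/{token_id}.png"
        with open(path, "w") as f:
            json.dump(metadata, f, indent=4)

# e.g. relink_metadata("./metadata", "<folder_cid>")
```

After relinking, re-upload the metadata folder itself to IPFS; its CID becomes the contract's base URI.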

ERC-721 Deployment using Solidity

// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

import "@openzeppelin/contracts/token/ERC721/ERC721.sol";
import "@openzeppelin/contracts/access/Ownable.sol";

contract NFTCreatorHub is ERC721, Ownable {
    uint256 public nextTokenId;
    string public baseURI;

    // Ownable(msg.sender) is required as of OpenZeppelin Contracts v5.x;
    // drop it if you are using v4.x
    constructor(string memory initialBaseURI)
        ERC721("NFTCreatorHub", "NFTCH")
        Ownable(msg.sender)
    {
        baseURI = initialBaseURI;
    }

    function mintNFT(address to) public onlyOwner {
        _safeMint(to, nextTokenId);
        nextTokenId++;
    }

    function setBaseURI(string memory newBaseURI) external onlyOwner {
        baseURI = newBaseURI;
    }

    function _baseURI() internal view override returns (string memory) {
        return baseURI;
    }
}
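One detail worth noting: OpenZeppelin's default tokenURI implementation simply concatenates _baseURI() with the decimal token ID, with no ".json" suffix. A Python sketch of the resulting URIs (the CID is a placeholder):

```python
# Mirrors OpenZeppelin ERC721.tokenURI: string concatenation of baseURI + tokenId
def token_uri(base_uri: str, token_id: int) -> str:
    return f"{base_uri}{token_id}"

print(token_uri("ipfs://<metadata_cid>/", 0))  # ipfs://<metadata_cid>/0
```

If the metadata files are pinned as 0.json, 1.json, and so on, either override tokenURI in the contract to append ".json", or upload the files without an extension so the default concatenation resolves.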

Deploying and Testing with Hardhat

  1. Set up a Hardhat project

mkdir nft-creator-hub
cd nft-creator-hub
npm init -y
npm install --save-dev hardhat
npm install @openzeppelin/contracts
npx hardhat init

  2. Deploy the contract with Hardhat and verify that baseURI points to the IPFS metadata location
