How To Build An NFT Token-Gated Blog with Ghost

Incorporating NFTs into your private blogging community

Justin Hunter
17 min read · Aug 11, 2022

Gated content is not a new concept. Every paywall you interact with is a gate. Substack subscriptions? That’s a gate. Medium membership? That’s a gate. Newspaper subscription walls? Another gate. But what if you could build your own gated solution with the flexibility to incorporate modern technology like NFTs?

In this article, we’re going to leverage Ghost, the phenomenal open source blogging platform, as our “front end” for creating posts. We’ll use Pinata’s Submarine API (and the Submarine SDK) to then upload completed posts from Ghost to your own private IPFS node. And finally, we’ll build a nice front end to allow for unlocking and delivery of the content.

To get started, you’ll need a Pinata account on the Picnic Plan or above. Pinata Submarine and the Dedicated Gateway you’ll need to manage and serve the blog posts are only available on paid plans. Once you’ve signed up for your account, you’ll want to make sure you have a dedicated gateway created. You can follow our guide here.

Don’t have a Pinata account yet? Get one here.

Setting Up Your Dev Environment

For this tutorial, we’re going to be working with Node.js, the command line, and Next.js. So, you’ll want to have a recent version of Node.js (and npm) installed and be comfortable running commands in a terminal.

We’re going to start by creating a project directory that will house two things:

  1. A locally running instance of Ghost
  2. A Next.js app to manage the backend and frontend of our token-gated blog

Run the following command from the root of whatever directory you use for all your crazy side projects:

mkdir token-gated-blog && cd token-gated-blog

Within that folder, we’re going to get ourselves set up with the skeleton for the two items mentioned above.

First we will create a bare bones Next.js project:

npx create-next-app blog

And finally, we will create a directory for our Ghost instance:

mkdir ghost-app

The final piece of set up is going to be setting up the Ghost admin app. To do so, you’ll need to install the Ghost CLI. You can follow the full instructions here.

To install the CLI, run:

npm install ghost-cli@latest -g

When that’s done, change into the empty directory you created for the Ghost admin instance:

cd ghost-app

Then, you can run:

ghost install local

When that process is complete, your Ghost instance will be up and running. You can start and stop it from the folder you’re in with the Ghost CLI using ghost start or ghost stop.

One final step is to visit the locally running instance and create an account. Go to http://localhost:2368/ghost/ and create your admin account.

Now we’re ready to get to work!

Connecting Ghost to Pinata

We want to make sure we can publish our posts to Pinata Submarine, but we also want to make sure that any media we add to our posts get uploaded to IPFS. So, let’s start implementing a Pinata storage adapter for Ghost.

We’ve already written a storage adapter to make things easier for you. In the ghost-app directory in your terminal, we will need to create a few directories. First, run:

cd content

Next, we will create an adapters directory:

mkdir adapters && cd adapters

Then, we will make a storage directory:

mkdir storage && cd storage

Finally, we need to clone the Pinata storage adapter repository:

git clone https://github.com/PinataCloud/pinata-ghost-storage

When that is complete, change into the directory and install dependencies:

cd pinata-ghost-storage && npm i

After everything is installed, you just need to tell Ghost to use that storage adapter. So, let’s back out of each directory until we are back in the ghost-app directory. In that directory, there is a config.development.json file. Open that in your code editor.

In that file, we need to add the following (I’m adding it under the database configuration section):

"storage": {
"active": "pinata-ghost-storage",
"pinata-ghost-storage": {
"gatewayUrl": "https://yourgatewayurl.com",
"pinataKey": "YOUR PINATA API KEY",
"pinataSecret": "YOUR PINATA API SECRET"
}
},

The storage adapter assumes you are OK with the images from your posts living on the public IPFS network. The post itself will be Submarined, but the images in the post will be referenced on IPFS’s public network.

Save the configuration file, then go back to your terminal. In the ghost-app directory, run:

ghost restart

This will apply the storage adapter changes and we’re ready to move on to the next way we will connect Ghost to Pinata.

Because we’re running a local version of Ghost, we have all Ghost’s features available to us. One of those features is the ability to leverage webhooks. We will use this to trigger the publication of a post to Pinata’s Submarine API when the post is published locally in our Ghost instance.

You might be asking why we aren’t just using Ghost for this. Two reasons:

  1. We are running Ghost locally and we want the content to be available everywhere.
  2. We want to token-gate content as opposed to just setting up a traditional membership wall.

To set up webhooks, go to the settings page in your locally running instance of Ghost. You can get there at this URL: http://localhost:2368/ghost/#/settings

On that page, click the Integrations option:

On the Integrations page, scroll down and choose “Add custom integration”:

On the next page after you click “Create” scroll down to the button that says “Add webhook”. Click that and label this webhook as the publishing hook:

The event, as the name would suggest, is when a post is published. The target URL will temporarily point to localhost. We’ll update this when our main Next.js app is deployed. For now, localhost:3000 is the most common local port for Next.js apps, so we’ll use that.

As you can see, we are going to post event info to an api route called publishPost.

Save that and let’s create one more for updating published posts:

Finally, you’ll want to create one more hook for deleting posts:

With all of these saved, we are ready to start building our Next.js app. Don’t worry, you’ll be able to write a blog post soon enough!

In another terminal/command line window change into your Next.js project directory:

cd ../blog

Open the code in your favorite code editor then go ahead and start the app with this command:

npm run dev

You’ll soon see that the app is running on localhost:3000. The beauty of Next.js is that it has both server and client capabilities, thanks to server-side rendering for the client and serverless functions for the server.

Take a look at the code for the Next.js app. There is a pages folder, and within that folder there is an api folder. This is where all our server side API routes will live. These will ultimately be serverless functions when the app is deployed. That api folder has one route as an example. Delete that and add five new files:

  1. publishPost.js
  2. updatePost.js
  3. deletePost.js
  4. getPosts.js
  5. [cid].js

These API routes correspond to the URLs we entered for the webhooks in the Ghost integration settings. We will use them to be notified of changes we make and then we can take the appropriate action on the backend.

Let’s get a quick tl;dr for what we plan to do with the first three of these endpoints.

publishPost

This endpoint will receive an event payload from the Ghost admin app indicating a new post was published. We will leverage the Ghost API to get the post in question and upload the content to the Pinata Submarine API. This will keep the content on a private IPFS node we control.

updatePost

This endpoint will take an event payload from the Ghost admin app that indicates that a published post has been updated. We need to take that information, upload a new version of the post to the Pinata Submarine API and then make sure we indicate somehow that this is the post that should be displayed for authenticated users.

deletePost

This one is pretty straightforward. When a post is deleted in the Ghost admin app, we’ll find the matching Submarined content and delete it as well.

Let’s start by wiring up the publishPost functionality. Before we can do anything though, we need to generate a Submarine Key that we can use within our code.

Log into your Pinata account, and click the avatar menu in the top-right then choose Submarine Keys:

Next, you’ll be able to generate a new Submarine API key. Copy that key and add it to a .env.local file in the root of your Next.js project like so, and go ahead and add your Dedicated Gateway URL there as well:

SUBMARINE_KEY=YOUR SUBMARINE API KEY
DEDICATED_GATEWAY_URL=https://yourgateway.com
CONTRACT_ADDRESS=NFT CONTRACT ADDRESS
BLOCKCHAIN=Ethereum
NETWORK=Mainnet

While we’re in the .env.local file, we should add info we’ll need later. We will need to verify ownership of a particular NFT, so we’ll need to know which chain, which network, and what contract address to check on. We can store that info in the .env.local file for ease.

We’ll also want to make use of the Pinata Submarine SDK to make our lives easier. So let’s install that. In your terminal, within the blog app, run:

npm i pinata-submarine

Now we’re ready to write some code. Inside the api/publishPost.js file, add the following:

import { Submarine } from "pinata-submarine";

const submarine = new Submarine(process.env.SUBMARINE_KEY, process.env.DEDICATED_GATEWAY_URL);

export default async function handler(req, res) {
  if (req.method === "POST") {
    try {
      const { post } = req.body;
      const { current } = post;
      const metadata = {
        title: current.title,
        feature_image: current.feature_image,
        featured: current.featured,
        excerpt: current.custom_excerpt,
        tags: JSON.stringify(current.tags),
        published: "true",
        ghostBlog: "true" // identifier the getPosts route will query on later
      };
      await submarine.uploadJson(current, current.id, 1, metadata);

      res.send("Success");
    } catch (error) {
      console.log(error);
      res.status(500).json(error);
    }
  }
}

Not much code involved. That’s the beauty of the Pinata Submarine SDK. Let’s take a look at what’s happening here, though.

The event published by our Ghost admin app includes the full JSON schema for the published post. If you’d like to learn more about that schema, take a look at Ghost’s docs here. We take that JSON, extract the current post information and we upload that as JSON directly to Pinata and into our private IPFS node using Submarine.
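
To make the handler above concrete, here is a trimmed sketch of the payload shape it destructures. The field names mirror the handler code; the values are hypothetical, and a real Ghost payload contains many more fields than shown here.

```javascript
// A trimmed sketch of a "post.published" webhook payload. Real payloads
// from Ghost include many more fields on `current`.
const examplePayload = {
  post: {
    current: {
      id: "62f4c1d2e4b0a1b2c3d4e5f6", // hypothetical Ghost post id
      title: "My First Gated Post",
      html: "<p>Secret content for NFT holders.</p>",
      feature_image: null,
      featured: false,
      custom_excerpt: "A teaser readers can see for free.",
      tags: []
    }
  }
};

// Mirror the destructuring done in publishPost.js:
const { post } = examplePayload;
const { current } = post;
console.log(current.title); // the title we store in the Pinata metadata
```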

Next, we update the Pinata metadata associated with that file to make it easier to find later and to show it nicely in the eventual front end app we’ll build. And that’s it.

Let’s connect the remaining two endpoints in a similar way, starting with updatePost.js.

import { Submarine } from "pinata-submarine";

const submarine = new Submarine(process.env.SUBMARINE_KEY, process.env.DEDICATED_GATEWAY_URL);

export default async function handler(req, res) {
  if (req.method === "POST") {
    try {
      const { post } = req.body;
      const { current } = post;
      const options = {
        name: current.id,
        metadata: JSON.stringify({
          published: "true"
        })
      };
      const content = await submarine.getSubmarinedContent(options);
      if (content.length > 0) {
        const { id } = content[0];
        await submarine.deleteContent(id);
        const metadata = {
          title: current.title,
          feature_image: current.feature_image,
          featured: current.featured,
          excerpt: current.custom_excerpt,
          tags: JSON.stringify(current.tags),
          published: "true",
          ghostBlog: "true" // keep the identifier getPosts queries on
        };
        await submarine.uploadJson(current, current.id, 1, metadata);
      } else {
        throw "No content found";
      }
      res.send("Success");
    } catch (error) {
      console.log(error);
      res.status(500).json(error);
    }
  }
}

There’s a little more going on here, but that’s simply the product of us updating content instead of just publishing it. We still take an event payload that looks similar to the publish post payload. But here, we need to first look up the Submarined content using the Pinata Submarine SDK.

Once we’ve found the content, we delete it. Next, we upload the new version of the content the same way we would if we were publishing a post for the first time.

That’s it! Simple.

Finally, let’s set up our deletePost.js route:

import { Submarine } from "pinata-submarine";

const submarine = new Submarine(process.env.SUBMARINE_KEY, process.env.DEDICATED_GATEWAY_URL);

export default async function handler(req, res) {
  if (req.method === "POST") {
    try {
      const { post } = req.body;
      const { previous } = post;
      const options = {
        name: previous.id,
        metadata: JSON.stringify({
          published: "true"
        })
      };
      const content = await submarine.getSubmarinedContent(options);
      if (content.length > 0) {
        const { id } = content[0];
        await submarine.deleteContent(id);
      } else {
        throw "Content not found";
      }
      res.send("Success");
    } catch (error) {
      console.log(error);
      res.status(500).json(error);
    }
  }
}

This shares some similarities with the update function but obviously we don’t need to repost new content. One other difference is in the event payload we destructure. Instead of looking for the current property for the post, we are looking for the previous property. This is because in Ghost’s admin account the post has already been deleted, so the current object is empty.
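
To make the difference concrete, here is a trimmed sketch of a delete-event payload relative to the publish event shown earlier. The id value is hypothetical:

```javascript
// Sketch of a "post.deleted" webhook payload: the post data lives under
// `previous`, while `current` is empty because the post no longer exists.
const deletePayload = {
  post: {
    current: {},
    previous: {
      id: "62f4c1d2e4b0a1b2c3d4e5f6", // hypothetical Ghost post id
      title: "My First Gated Post"
    }
  }
};

// Mirror the destructuring in deletePost.js:
const { previous } = deletePayload.post;
console.log(previous.id); // the name we use to look up the Submarined file
```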

We have two more routes to wire up, but we’re only going to do one of them right now. We’re going to focus on the getPosts route. Add the following to that file:

import { Submarine } from "pinata-submarine";

const submarine = new Submarine(process.env.SUBMARINE_KEY, process.env.DEDICATED_GATEWAY_URL);

export default async function handler(req, res) {
  if (req.method === "GET") {
    try {
      const { offset, limit } = req.query;
      const options = {
        metadata: JSON.stringify({
          ghostBlog: {
            value: "true",
            op: "eq"
          }
        }),
        offset,
        limit
      };
      const content = await submarine.getSubmarinedContent(options);
      if (content.length > 0) {
        return res.json(content);
      } else {
        return res.json([]);
      }
    } catch (error) {
      console.log(error);
      res.status(500).json(error);
    }
  }
}

In this, we are using the identifier we specified to query Submarined files’ metadata and only return the files that match. We use an offset and a limit to set us up for pagination, but actual pagination will be outside the scope of this tutorial.
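
If you did want to build pagination on top of this, the mapping from a page number to the offset/limit query params is simple. The `postsUrl` helper name below is illustrative, not part of the tutorial’s code:

```javascript
// Hypothetical helper showing how a client-side pager would map a page
// number onto the offset/limit query params the getPosts route accepts.
const postsUrl = (page, pageSize = 10) =>
  `/api/getPosts?offset=${page * pageSize}&limit=${pageSize}`;

// Page 0 is what a first load would request; later pages
// would be fetched from the client as the reader pages through.
console.log(postsUrl(0)); // "/api/getPosts?offset=0&limit=10"
console.log(postsUrl(2)); // "/api/getPosts?offset=20&limit=10"
```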

The results of this request will give us enough info to build a post list without revealing the actual contents of the post.

We’ll leave the [cid].js file for later and move on from here for now. We’ve connected Ghost to Pinata. Now, we need to build our app and token-gating.

Build The Token-Gating App

Let’s talk about what this app needs to do. It needs to show a list of posts but not the content of those posts. Ghost has a nice excerpt feature which, if you use it, will be exposed as part of the JSON we’re storing with Pinata Submarine. So, if you have an excerpt, you might want to display that but not the whole post. The same is true for feature images, etc.

Since we’re using Next.js and we did a nice job setting up our posts to include metadata that would be useful for showing on a Posts page, we can make use of the built-in getServerSideProps function. Let’s set this up in our pages/index.js file and set up the layout a little bit as well. Replace the index.js code with the following:
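
The original article embedded this file as a Gist that isn’t reproduced here. Based on the description below (a getServerSideProps fetch with a hardcoded offset and limit, a Skeleton loading state, and a Posts component per post), a minimal sketch might look like this. The markup, class names, and page title are illustrative:

```jsx
import Head from 'next/head';
import { useState } from 'react';
import Posts from '../components/Posts';
import Skeleton from '../components/Skeleton';

export default function Home({ posts }) {
  //  Loading state for any subsequent client-side fetches (e.g. pagination).
  const [loading] = useState(false);
  return (
    <div className="main">
      <Head>
        <title>Token-Gated Blog</title>
      </Head>
      <h1>My Token-Gated Blog</h1>
      {loading ? (
        <Skeleton />
      ) : (
        posts.map((post) => <Posts key={post.id} post={post} />)
      )}
    </div>
  );
}

//  Runs on the server before render; the offset/limit are hardcoded
//  because this is the first load.
export async function getServerSideProps() {
  const res = await fetch('http://localhost:3000/api/getPosts?offset=0&limit=10');
  const posts = await res.json();
  return { props: { posts } };
}
```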

In addition to some basic layout changes, you can see there are two components that we need to create: Skeleton.js and Posts.js

We’ll create those in a second, but let’s look at what’s going on in the rest of this file. We’re going to start at the bottom and take a look at the getServerSideProps function. This function runs before the client side code is generated and then it is injected into the client code. As you can see, we’re making a request to our backend to get posts. We hardcode the offset and limit because this is for first load. Subsequent requests (page changes if you set up pagination) would happen from the client directly.

The result of this function is passed as props to the client side component. We can then use that to render posts. We take the post data and pass each mapped post into a post component. You’ll also note that we have a skeleton component set up for our loading state.

Let’s create our components. In the root of your project directory add a new folder called components. Inside that folder, add a file called Skeleton.js. Let’s add the following to that file:
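
The Gist for this component isn’t reproduced here; a minimal placeholder consistent with the “shimmering empty state” described below might look like this. The class names are illustrative and assume matching animation rules in your global CSS:

```jsx
import React from 'react';

//  A placeholder card shown while posts load; the "shimmer" class
//  would carry the animation in your global stylesheet.
const Skeleton = () => {
  return (
    <div className="card">
      <div className="shimmer title-placeholder" />
      <div className="shimmer text-placeholder" />
      <div className="shimmer text-placeholder" />
    </div>
  );
};

export default Skeleton;
```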

This will give a nice shimmering empty state post while we load our content. Next, let’s set up the posts component. Create another file in the components folder called Posts.js. Add the following to that file:

import Link from 'next/link';
import React from 'react';

const Posts = ({ post }) => {
  const { metadata, createdAt, cid } = post;
  const { title, excerpt } = metadata;
  return (
    <Link href={`/${cid}`}>
      <div className="card cursor">
        <h1>{title}</h1>
        <p>{createdAt}</p>
        <p>{excerpt}</p>
        <h3>Read More</h3>
      </div>
    </Link>
  );
};

export default Posts;

Pretty simple React component here. We are taking the post info, destructuring the metadata, and using that to build our basic post info. We wrap the entire post code in a Link element from Next.js. By clicking on the post component, the user will be taken to a single post page mapped to the Submarined file’s CID.

Now, we just need to build the page that lets us authenticate a reader and serve the content if they are authenticated. For the sake of this tutorial, we’ll be using the personal_sign method in Ethereum, but in a production app, you might consider using Sign In With Ethereum to manage sessions.

Let’s create a new page in the pages folder. It should be called [cid].js. This is called dynamic routing in Next.js. It will render the page no matter what that cid value is, but we can then use that value dynamically when making other requests. Add the following to your [cid].js file:

import { useRouter } from 'next/router';
import React, { useEffect, useState } from 'react';
import Authenticate from '../components/Authenticate';
import Post from '../components/Post';

const SinglePost = () => {
  const [cid, setCID] = useState("");
  const [authenticated, setAuthenticated] = useState(false);
  const [postContent, setPostContent] = useState(null);
  const router = useRouter();
  const { query } = router;

  useEffect(() => {
    if (query && query.cid) {
      setCID(query.cid);
    }
  }, [query]);

  if (authenticated) {
    return <Post cid={cid} postContent={postContent} />;
  }

  return (
    <div>
      <Authenticate cid={cid} setAuthenticated={setAuthenticated} setPostContent={setPostContent} />
    </div>
  );
};

export default SinglePost;

As you can see right from the top of the file, we’re going to need to create two new files in our components folder: Authenticate and Post. Outside of that, let’s take a look at what’s going on in this file.

We are using Next.js’s built-in router to find the query parameter (the CID). We use the useState hook to store the CID, and we pass the state setters down to the two components we’re about to create.

Let’s create each of those components to get a better idea of how this all fits together. Go ahead and create a file in the components folder called Authenticate.js. There’s going to be a lot going on inside that file, so we’ll use a Gist again to show what you should add. Add the following to that file:
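
The original Gist isn’t reproduced here; a condensed sketch of the flow described below might look like the following. It assumes the wagmi hook APIs as they existed around v0.6 (useAccount, useConnect, useDisconnect, useSignMessage), and it assumes the GET response from our /api/[cid] route contains a message string and an id — check the actual response shape of getEVMMessageToSign before relying on this:

```jsx
import React, { useEffect, useState } from 'react';
import { useAccount, useConnect, useDisconnect, useSignMessage } from 'wagmi';

const Authenticate = ({ cid, setAuthenticated, setPostContent }) => {
  const [messageId, setMessageId] = useState(null);
  const [signature, setSignature] = useState(null);
  const { address, isConnected } = useAccount();
  const { connect, connectors } = useConnect();
  const { disconnect } = useDisconnect();
  const { signMessage } = useSignMessage({
    onSuccess(data) {
      // Signing succeeded; stash the signature so the effect below fires.
      setSignature(data);
    },
  });

  useEffect(() => {
    const verify = async () => {
      // POST the signature, message id, and wallet address for verification.
      const res = await fetch(`/api/${cid}`, {
        method: 'POST',
        body: JSON.stringify({ signature, messageId, address }),
      });
      if (res.ok) {
        const data = await res.json();
        // The returned access link points at the Submarined post JSON;
        // fetch it so the Post component has title/html to render.
        const postRes = await fetch(data.link);
        setPostContent(await postRes.json());
        setAuthenticated(true);
      }
    };
    if (signature && messageId) {
      verify();
    }
  }, [signature, messageId]);

  const verifyNFT = async () => {
    // Fetch the message the user needs to sign (shape assumed: { id, message }).
    const res = await fetch(`/api/${cid}`);
    const data = await res.json();
    setMessageId(data.id);
    signMessage({ message: data.message });
  };

  if (isConnected) {
    return (
      <div>
        <button onClick={verifyNFT}>Verify NFT</button>
        <button onClick={() => disconnect()}>Disconnect</button>
      </div>
    );
  }

  return (
    <div>
      {connectors.map((connector) => (
        <button key={connector.id} onClick={() => connect({ connector })}>
          {connector.name}
        </button>
      ))}
    </div>
  );
};

export default Authenticate;
```

Note that wagmi also requires wrapping your app in its provider (in pages/_app.js) with a configured client; see the wagmi docs for that setup.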

Right off the bat, you might notice we are making use of a library we haven’t installed yet. The wagmi library provides React hooks for working with Ethereum. We’ll use it for connecting to wallets and signing messages.

In fact, that’s exactly what’s happening in this file. We check if the user’s wallet is connected and conditionally render the appropriate content. If the wallet is not connected, we leverage the wagmi library’s hooks to show a list of wallet options. When the wallet is connected, the screen updates with a Verify NFT button and a Disconnect button. The Disconnect button will remove the wallet connection and allow the user to select a different wallet provider. The Verify NFT button is what we really care about, though.

When that button is clicked, we make a GET request to our API (remember that route we still need to build out on the backend?). We’ll do that in a moment, but you can see the request we make results in a message being returned that we need to sign. We set a unique identifier that will be used soon in state, and we pass the message data to the signMessage function provided by wagmi.

This will trigger the user’s wallet provider to display the message and ask if they want to sign. In the useSignMessage hook, there is an onSuccess callback. When that code runs, it means the user successfully signed the message. We need to use both that message id we saved earlier and the signature together, so we save the signature result of the user signing the message to state. Then, we make use of a useEffect hook to check for changes to both the signature and the messageId. When we have both, we know we can fire off a POST request to our new endpoint with that info plus the user’s wallet address.

That request, if successful, will return the post content and we can update the page.

So where does that leave us? We need to write the code for the last API route, right? Let’s do that now. In the pages/api/[cid].js file, add the following:

import { Submarine } from "pinata-submarine";

const submarine = new Submarine(process.env.SUBMARINE_KEY, process.env.DEDICATED_GATEWAY_URL);

export default async function handler(req, res) {
  if (req.method === "GET") {
    try {
      const message = await submarine.getEVMMessageToSign(process.env.BLOCKCHAIN, process.env.CONTRACT_ADDRESS);
      res.json(message);
    } catch (error) {
      console.log(error);
      res.status(500).json(error);
    }
  } else {
    try {
      const { signature, messageId, address } = JSON.parse(req.body);
      const ownsNFT = await submarine.verifyEVMNFT(
        signature,
        address,
        messageId,
        process.env.BLOCKCHAIN,
        process.env.CONTRACT_ADDRESS,
        process.env.NETWORK
      );
      if (ownsNFT) {
        const { cid } = req.query;
        const postData = await submarine.getSubmarinedContentByCid(cid);
        const { id, metadata } = postData.items[0];
        const link = await submarine.generateAccessLink(1000, id, cid);
        return res.json({
          link,
          id,
          metadata,
        });
      } else {
        return res.status(401).send("NFT not owned or invalid signature");
      }
    } catch (error) {
      console.log(error);
      const { response: fetchResponse } = error;
      return res.status(fetchResponse?.status || 500).json(error.data);
    }
  }
}

In this code, you’ll see the Pinata Submarine SDK is really making our lives easier. In the GET request, we use the SDK to get the message to sign by calling getEVMMessageToSign. This works for all EVM (Ethereum Virtual Machine) based chains. It requires a blockchain name (i.e. “Ethereum”) and the contract address for the NFT you’re trying to verify.

The result of this call is what the front end uses to prompt the signature request. Then, in the verification request, we hit the POST method.

There’s a little more going on in the POST method, but we’re once again saved by the Pinata Submarine SDK. Here, we make use of the verifyEVMNFT method from the SDK. We pass in some parameters, including the network the token lives on (i.e. “Mainnet”). The response to this is a simple boolean: true/false.

Next, if the user does own the NFT, we know we can safely generate a link to load in the app and display the content. Then we return that with some additional helper data.

With that, we have everything we need to write our final piece of the app. The post page.

In your components folder, add another file called Post.js. In that file, add the following:

import React from "react";
import Link from "next/link";

const Post = ({ postContent }) => {
  return (
    <div>
      <Link href="/">Back</Link>
      <div>
        <h1>{postContent.title}</h1>
      </div>
      <div>
        <div dangerouslySetInnerHTML={{ __html: postContent.html }} />
      </div>
    </div>
  );
};

export default Post;

This simple page will render our blog post. You can style this however you’d like (and I recommend you do style it), but the hard work is done. You’ve built a fully functioning NFT-gated blog powered by Ghost and Pinata Submarine.

Here’s what it might look like (with no styling):

Wrapping Up

NFTs are powerful. They are an infinitely scalable accounting and distribution system. They are an authentication system. They open new markets. They can literally be anything. In this tutorial, we leveraged the power of NFT access and Pinata’s IPFS media (public and private) APIs to build a powerful app. And we even used the popular open source blogging app Ghost as our writing front end.

With this knowledge, imagine what else you can build. I can’t wait to see it!
