This article covers the first steps towards the vision of a customized dashboard with easy-to-understand statistics that gives us an overview of our AWS Fundamentals community. Because you have to know your numbers, right?
In the course of this article, we will:
set up a crawler that retrieves the total number of our AWS Fundamentals Newsletter subscribers on a daily basis
display the current number of our subscribers on a dashboard
The Social Stats Dashboard project and the article that comes with it are primarily designed for absolute AWS newbies.
That’s why we will focus on the basics and sometimes explain them in more detail.
Technologies & Frameworks
Before we can start to actually code our application, we need to take a step back and think more broadly.
How do we want to access our application?
How do we ensure that all of the desired features interlock?
What infrastructure do we need to run the application?
Are there already helpful frameworks on the market that we could use?
Answer: Our Social Stats Dashboard is a web application that is based on
AWS Lambda for our backend features,
Serverless Stack (SST v3) for our infrastructure and
Next.js for our frontend.
Serverless
Imagine running an application like our Social Stats Dashboard on our own server. We would spend a lot of time managing and maintaining the server, even though it is just a very simple application.
This is where Serverless (Computing) comes in handy. By using the Serverless Computing approach, all we have to do is focus on the actual code to build our application, while AWS is responsible for provisioning, managing and maintaining the resources to run the code.
AWS Lambda
With AWS Lambda, AWS offers a service that takes care of the server-side resources needed to run our code. Usually, we hand over the code to Lambda in the form of so-called Lambda functions. Lambda runs a function only when it is invoked by its defined trigger (e.g. HTTP requests, scheduled events or other AWS services).
We use AWS Lambda to set up our backend features. The first Lambda function will fetch the total number of subscribers from the Kit API every 24 hours and store it in a database (Kit was formerly known as ConvertKit; it is the tool we use for email communication). The second Lambda function will return the current number of subscribers from the database to display it on our Social Stats Dashboard.
You may have heard the term Function as a Service (FaaS) a few times. Now you know what this is about.
Serverless Stack (SST)
To understand the benefits of SST, we have to talk about the AWS Cloud Development Kit (AWS CDK) and AWS CloudFormation. CloudFormation is an AWS service that allocates all the resources (like storage, networks or servers) to deliver your application. Usually, you deploy a template to AWS CloudFormation. This template defines all the resources and their properties. But with the AWS CDK, you can use a programming language of your choice instead of templates to configure your infrastructure. Your code is translated into AWS CloudFormation templates behind the scenes. This is also known as Infrastructure as Code (IaC).
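To make IaC a bit more tangible, here is a minimal, purely illustrative sketch (not part of our project) of what a CDK app in TypeScript can look like. The CDK synthesizes this code into a CloudFormation template behind the scenes.
import { App, Stack } from "aws-cdk-lib";
import { Bucket } from "aws-cdk-lib/aws-s3";

// A stack groups related resources; the CDK turns it into a CloudFormation template.
const app = new App();
const stack = new Stack(app, "ExampleStack");

// A few lines of TypeScript describe a fully managed, versioned S3 bucket.
new Bucket(stack, "ExampleBucket", {
  versioned: true,
});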
Now we can move on to Serverless Stack (SST). SST is a framework based on the concept of Serverless and IaC. It abstracts the configuration of your infrastructure away from you even more than the AWS CDK does. Your entire AWS infrastructure is defined in a single sst.config.ts file. Using Pulumi (SST v3) or CDK/CloudFormation (SST v2), SST takes care of your entire AWS infrastructure in the background.
Moreover, SST enables you to test and debug your AWS applications locally (npx sst dev), which can be tricky without SST in most cases. All in all, SST eases the development of AWS applications and is therefore a great tool for beginners.
With SST as the basic framework for our Social Stats Dashboard, we guarantee the serverless approach for the application. The only thing we have to do regarding our application’s infrastructure is to define a DynamoDB table (database), a Cron job (scheduled event), our Lambda functions (backend features), a Secret (API key handling) and a Next.js Site (frontend with UI) - which are the resources we need to run our application - in the sst.config.ts file. And then SST does its magic.🪄
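To give a first impression, here is a rough sketch of how an SST v3 sst.config.ts file is structured (the component calls are only hinted at here; we will define the real ones later in this article):
/// <reference path="./.sst/platform/config.d.ts" />

export default $config({
  // General app settings such as the app name and the home provider.
  app(input) {
    return {
      name: "awsfundamentals-social-stats",
      home: "aws",
    };
  },
  // All infrastructure components are defined inside run().
  async run() {
    // e.g. new sst.aws.Dynamo(...), new sst.aws.Cron(...), new sst.aws.Nextjs(...)
  },
});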
Next.js
React is a JavaScript library built to create user interfaces for web applications based on components. Next.js is a framework that eases the development of full-stack web applications with React, since it provides fast, easy-to-use built-in features for common web development tasks like routing or data fetching.
We use a Next.js frontend to create the user interface for our Social Stats Dashboard without any major effort and access it via a web browser.
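As a tiny, hypothetical illustration of how little code a Next.js page needs: a file app/page.tsx that exports a React component is rendered at the root route.
// app/page.tsx - a minimal page component (illustrative only).
export default function Home() {
  return <h1>Hello from Next.js!</h1>;
}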
Architecture
After choosing our technologies, it is time to think in more detail and create our application’s architecture. Which AWS services do we want to work with? How and when do these services communicate with each other? What kind of information do we need from third-party applications?
As shown in the illustration above, the user accesses the Social Stats Dashboard with the current number of AWS Fundamentals Newsletter subscribers via a simple, non-interactive Next.js site. AWS CloudFront ensures the smooth distribution of our application. It is only listed here for the sake of completeness, since SST automatically sets up the distribution without any further action on our side. The source of information for the number of newsletter subscribers is Kit - a third-party application. So our first Lambda function, triggered every day at midnight, is responsible for fetching the current number of subscribers from the Kit API and storing it in a DynamoDB table. The DynamoDB table serves our application as a database to store the number of subscribers with a timestamp. To finally display the current number of newsletter subscribers on the Next.js site, a second Lambda function retrieves the latest entry from the DynamoDB table every time the site is (re)loaded.
Building and Testing the Social Stats Dashboard with the SST CLI
Time to finally create our Social Stats Dashboard. SST’s Command Line Interface (CLI) helps us to build, test and deploy our application.
Preconditions:
We ensure that our AWS account is set up with the necessary permissions and our AWS credentials are configured in our CLI (terminal).
The latest version of Node.js and pnpm are properly installed. (pnpm is the package manager of our choice.)
Our terminal is opened to run the following SST commands.
Build a Next.js App:
We create a Next.js project, give it a name and go with all default configurations.
npx create-next-app@latest awsfundamentals-social-stats
Initialize SST for the App:
We navigate to the project directory created in the previous step and initialize SST for the project. This adds an sst.config.ts file to your Next.js application and runs sst install behind the scenes to install all necessary packages for your providers.
npx sst@latest init
❗Make sure to properly install your dependencies. The problem with the npx sst@latest init command is that it automatically runs sst install, which in turn uses npm to install the dependencies for your project. So basically, it is the same as npm install. Since we want to use pnpm instead of npm and don’t want to struggle with the problems that using two different package managers in the same project brings along, it is necessary to delete pnpm-lock.yaml, package-lock.json and node_modules from your project and run pnpm i / pnpm install to reinstall all your dependencies before changing anything else in your project.
Develop and test the App:
At first, we only want to develop and test our application locally. By running the npx sst dev command, our application is deployed to a development (dev) stage. Each process (deployment, backend functions, frontend) runs in a separate window in our terminal, selectable via the sidebar. We can now access our frontend locally in a browser at http://localhost:3000.
npx sst dev
As long as npx sst dev is running in our terminal, all changes to the code are immediately transferred to the infrastructure, the frontend and the backend. That’s how SST facilitates the testing of applications for us.
Project Structure
Following the steps above to create the Social Stats Dashboard application via the SST CLI, we have basically generated a bunch of files which are structured in the so-called drop-in mode. These files form the SST project we will work on to get our application running.
awsfundamentals-social-stats
├─ app
│  ├─ ...
│  └─ page.tsx [Frontend]
├─ functions [Backend]
│  ├─ get_subscribers.ts
│  └─ sync_subscribers.ts
├─ next.env.d.ts
├─ next.config.mjs
├─ node_modules
│  └─ ...
├─ pages
│  └─ api
│     └─ invoke_lambda.ts
├─ package.json
├─ pnpm-lock.yaml
├─ postcss.config.mjs
├─ README.md
├─ sst-env.d.ts
├─ sst.config.ts [Infrastructure: Connecting everything🪄]
├─ tailwind.config.ts
└─ tsconfig.json
The app and node_modules directories as well as the individual files in the root directory, such as sst.config.ts or package.json, are added by default in drop-in mode and after properly installing your dependencies.
We use the page.tsx file in the app directory to define our Next.js frontend. The invoke_lambda.ts file inside the pages/api directory defines the API endpoint to invoke our Lambda function via an API Route. Then we add a functions directory for our backend Lambda functions to the root directory. Frontend and backend features are connected by the sst.config.ts file, which creates the app’s infrastructure.
Infrastructure: SST Components
At first, we work on the code in the sst.config.ts file. This is where we define the SST components. As shown in the illustration below, every component represents a resource needed for a certain backend or frontend feature of our application.
There are two types of SST components: low-level provider components and high-level built-in components. The high-level components already include all necessary low-level AWS resources for a certain feature. So for now, we only work with the high-level components to reduce complexity.
In the SST docs you can find all the high-level built-in components for AWS. You can add a component to your application by using the constructors from sst.aws.*. Each constructor takes three arguments:
new sst.aws.*(name, args, opts?)
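For example (purely illustrative and not part of our dashboard), an S3 bucket component could be added like this; args and opts are optional and fall back to sensible defaults when omitted:
// name: "AssetsBucket"; args and opts are omitted here, so SST uses its defaults.
const bucket = new sst.aws.Bucket("AssetsBucket");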
For our Social Stats Dashboard, we work with five components: a DynamoDB table, a Cron job, a Function, a Next.js Site and a Secret. All of them need to be defined by a constructor within the executing function (run()) in our sst.config.ts file.
DynamoDB Table
DynamoDB is AWS’ fully managed NoSQL database service to store any type of data. We use a DynamoDB table to store the total number of newsletter subscribers fetched from the Kit API daily. Our Dynamo component is named "NewsletterSubscribersTable" and stored in a variable named table. It is configured with three args:
//Create the DynamoDB table to store the total number of newsletter subscribers (from Kit).
const table = new sst.aws.Dynamo("NewsletterSubscribersTable", {
  fields: {
    id: "string",
    timeStamp: "number",
    fixedHashKey: "string",
  },
  primaryIndex: { hashKey: "id" },
  globalIndexes: {
    TimeStampIndex: { hashKey: "fixedHashKey", rangeKey: "timeStamp" },
  },
});
fields - The rows in a DynamoDB table are called items. Each item in our Newsletter Subscribers Table consists of four attributes. Within the fields argument, we list all the attributes (including their type) which we want to use as an index to query the table later: id, timeStamp and fixedHashKey ("ALL_ITEMS").
💡Items in a DynamoDB table can consist of more attributes than defined in the fields argument. Since we don’t want to query our table by the number of subscribers, we don’t need to define a subscribers attribute within the fields argument. Our DynamoDB table still has a subscribers attribute; it is sufficient to introduce this attribute in our backend Lambda functions.
primaryIndex - The attribute which is used as the unique identifier for the items in a DynamoDB table is called the primary key. The primary key can either consist of one attribute (hash key) or two attributes (hash key and range key). The primaryIndex argument specifies by which primary key you query your table. In our case, we use id as the hash key for our primary index.
globalIndexes - The globalIndexes argument is used to define secondary indexes. You can create secondary indexes to query your table by another key than the one defined as the primary index. Whereas a local secondary index (LSI) always has the same hash key as your primary index, the hash key of a global secondary index (GSI) can be freely chosen. We define a GSI named TimeStampIndex with the fixedHashKey value as hash key and the timeStamp value as range key.
So in the end (or after a few testing cycles), our Newsletter Subscribers Table looks something like this:
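For illustration, a single item in the table might look roughly like this (all values are hypothetical):
// Hypothetical example item in the Newsletter Subscribers Table.
const exampleItem = {
  id: "3f2c9a1e-6d4b-4c8a-9b2e-1a7d5e8f0c3b", // random UUID (hash key of the primary index)
  timeStamp: 1735689600000, // Unix timestamp in milliseconds (range key of the GSI)
  subscribers: 12345, // total number of newsletter subscribers
  fixedHashKey: "ALL_ITEMS", // constant value (hash key of the GSI)
};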
Secrets
As shown in the application’s architecture further above, we have to fetch the total number of newsletter subscribers from the Kit API. To access personal data from the Kit API, our account is provided with a personal API key.
The Secret component ensures that secret data (like our API key) is stored confidentially inside of AWS. Our Secret component is named "KitApiKey" and the reference is stored in a variable named secret:
//Create a Secret for the Kit API key.
const secret = new sst.Secret("KitApiKey");
After defining the Secret component, we set its value with the SST CLI. This has to happen while sst dev is running.
npx sst secret set KitApiKey '$api-key-value'
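For later reference: once the Secret is linked to a function, its value can be read at runtime through the Resource object from the sst package - exactly what our sync_subscribers function will do further below.
import { Resource } from "sst";

// Read the linked secret's value at runtime (only available in functions the secret is linked to).
const apiKey = Resource.KitApiKey.value;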
Cron Job
A Cron component automatically executes tasks according to the setup (in particular the schedule) defined in its constructor. That’s why it is the perfect component to run our Lambda function sync_subscribers. This Lambda function makes sure that the total number of subscribers is fetched from the Kit API and stored in our Newsletter Subscribers Table every day at midnight. Behind the scenes, the Cron component uses Amazon EventBridge.
Our Cron component is named "SyncTotalSubscribers" and stored in a variable named cron. It is configured with two args:
//Create the cron job to fetch the total number of newsletter subscribers from the Kit API every day at midnight UTC and store it in the DynamoDB table.
const cron = new sst.aws.Cron("SyncTotalSubscribers", {
  schedule: "cron(0/5 * * * ? *)", // Testing schedule: every 5 minutes. Production schedule: cron(0 0 * * ? *) (every day at midnight UTC).
  job: {
    handler: "functions/sync_subscribers.handler",
    // Link the DynamoDB table and Secret to the Cron job to grant permissions.
    link: [table, secret],
  },
});
schedule - The schedule (obviously) defines when, or rather how frequently, a cron job runs. With the cron expression cron(0 0 * * ? *), the sync_subscribers handler function is executed every day at midnight. (For testing purposes, we use "cron(0/5 * * * ? *)" to run the cron job every 5 minutes.) Cron expressions for cron-based schedules in Amazon EventBridge have six fields (see the AWS Lambda documentation; a few more example expressions follow right after this list):
| Field No. | Field Description | Value | Meaning in cron(0 0 * * ? *) |
| --- | --- | --- | --- |
| 1 (required) | Minutes | 0 - 59 | 0 - First minute of the day. |
| 2 (required) | Hours | 0 - 23 | 0 - First hour of the day. |
| 3 (required) | Day of Month | 1 - 31 | * - Every day of the month. |
| 4 (required) | Month | 1 - 12 or JAN - DEC | * - Every month of the year. |
| 5 (required) | Day of Week | 1 - 7 or SUN - SAT | ? - Any day of the week. |
| 6 (optional) | Year | 1970 - 2199 | * - Every year (until 2199). |
job - Within the job argument we point to the Lambda function which should be executed when the cron job runs. To sync the Newsletter Subscribers Table with the current number of subscribers, we want to execute the handler method of the sync_subscribers function, which is located at functions/sync_subscribers.handler. To fetch and store the total number of subscribers, the Lambda function needs access to the Kit API and the Newsletter Subscribers Table. We grant this access by linking our Secret component and our DynamoDB table to the cron job (link: [table, secret]).
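As promised, a few more cron expressions written in the six-field format (the first two are the ones from this project, the third is just an additional, assumed example):
// A few schedules in the six-field EventBridge format (illustrative).
const daily = "cron(0 0 * * ? *)"; // every day at 00:00 UTC (our production schedule)
const everyFiveMinutes = "cron(0/5 * * * ? *)"; // every 5 minutes (our testing schedule)
const weekly = "cron(0 12 ? * MON *)"; // every Monday at 12:00 UTC (extra example)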
Function
Now that our total number of subscribers is properly fetched and stored, we need a Function component to realize the second Lambda function, get_subscribers. The get_subscribers function retrieves the latest entry from the DynamoDB table so it can be displayed on our dashboard in the end. Our Function component is named "GetTotalSubscribers" and stored in a variable named lambda2. It is configured with two args:
//Create the Function component to fetch the total number of newsletter subscribers from the DynamoDB table.
const lambda2 = new sst.aws.Function("GetTotalSubscribers", {
  handler: "functions/get_subscribers.handler",
  // Link the DynamoDB table to the function to grant permissions.
  link: [table],
});
handler - The Lambda function which our Function component shall call is the handler method of the get_subscribers function located at functions/get_subscribers.handler.
link - By linking our DynamoDB table to the Function component, we grant the get_subscribers function access to our DynamoDB table (same as for the Cron component before).
Nextjs Site
The Nextjs component gives us the opportunity to deploy a Next.js application (rooted in the app directory) to AWS with a single constructor call:
//Create the Next.js app (frontend).
const site = new sst.aws.Nextjs("SocialStatsDashboard", {
  link: [lambda2],
});
Our Nextjs component is named "SocialStatsDashboard" and stored in a variable named site. By linking our Function component lambda2 to the Nextjs component, we grant the Next.js frontend application permission to invoke the get_subscribers function.
Backend: Lambda Functions
Now that we have defined our resources, it is time to create the features we need them for. The Social Stats Dashboard has two features, and both of them are realized with a Lambda function:
Fetch the total number of newsletter subscribers from the Kit API and store it in the Newsletter Subscribers Table. → functions/sync_subscribers.ts
Retrieve the latest number of newsletter subscribers from the Newsletter Subscribers Table to display it on our dashboard. → functions/get_subscribers.ts
Before diving deeper into these two Lambda functions, we take a look at the general structure of a Lambda function. A Lambda function (in Node.js) is defined by a Lambda handler, an event object and a context object (see the AWS Lambda documentation):
export const handler = async (event, context) => {
// Add the stuff that the function is supposed to do here.
};
Synchronizing
The first step in our sync_subscribers Lambda function is fetching the current number of subscribers from the Kit API. We access the Kit API via axios.get(url[, config]) from the axios library. The right Kit API (v3) endpoint for our purpose is api.convertkit.com/v3/subscribers, and we have to hand over the Kit API key to allow the axios call to access the Kit API endpoint. Finally, we extract the number of subscribers from the entire Kit API response and store it in a variable called totalSubscribers.
//Fetch the number of subscribers from the Kit API.
const apiUrl = "https://api.convertkit.com/v3/subscribers"; // Kit API (v3) endpoint for subscriber data.
console.log("Fetching data from Kit API...");
const response = await axios.get(apiUrl, {
  params: {
    api_secret: Resource.KitApiKey.value,
  },
});
const totalSubscribers = response.data.total_subscribers;
console.log("Total subscribers fetched:", totalSubscribers);
To store the value of totalSubscribers as an item in our Newsletter Subscribers Table, we need to define the values for the remaining attributes. For the id attribute, we generate a unique, random identifier (UUID) with the help of the uuid library. As a Unix timestamp, the timeStamp attribute represents the number of milliseconds that have elapsed since January 1, 1970. Unix timestamps can be generated with Date.now().
// Store the fetched data in a DynamoDB table.
const uniqueId = uuidv4(); //Create a unique random number.
const timestamp = Date.now(); //Create a timestamp. Timestamps are often represented as Unix timestamps, which are the number of milliseconds that have elapsed since January 1, 1970. These timestamps can be stored as numbers in a DynamoDB table.
console.log("Generated unique ID:", uniqueId);
console.log("Current timestamp:", timestamp);
In the last step, we create the new item in our Newsletter Subscribers Table with a DynamoDB client and the PutItemCommand() from the AWS SDK for JavaScript v3. With the PutItemCommand(), we determine the DynamoDB table in which we want to put the new item as well as the individual values for the four attributes of the item. After its declaration, the PutItemCommand() is sent to the DynamoDB client to create the new item.
💡The runtime name of a linked resource can be accessed via Resource.your-component-name.name. In our code, we store our Dynamo component in the table variable and name the component "NewsletterSubscribersTable". But this is not the name our Lambda functions use to interact with the table at runtime. SST automatically generates a unique name for resources like DynamoDB tables to avoid naming conflicts across deployments. With TableName: Resource.NewsletterSubscribersTable.name, we hand over the “runtime name” of the Newsletter Subscribers Table to the PutItemCommand().
//Initialize a new DynamoDB client and define the PutItemCommand to store the number of subscribers as a new item in the DynamoDB table.
const client = new DynamoDBClient({});
const command = new PutItemCommand({
  TableName: Resource.NewsletterSubscribersTable.name,
  Item: {
    id: { S: uniqueId },
    timeStamp: { N: timestamp.toString() },
    subscribers: { N: totalSubscribers.toString() },
    fixedHashKey: { S: "ALL_ITEMS" },
  },
});
//Send the PutItemCommand to the DynamoDB table.
console.log("Sending PutItemCommand to DynamoDB...");
const result = await client.send(command);
return {
  statusCode: 200,
  body: JSON.stringify({ message: "Data stored successfully." }),
};
If we want to earn some bonus points (🤓), we can also implement proper error handling with a try…catch structure and error messages. But let’s get our Social Stats Dashboard to work first.
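A minimal sketch of what such error handling could look like around the DynamoDB write (our own assumption, not the article’s final code):
// Hypothetical try...catch around the PutItemCommand (sketch only).
try {
  const result = await client.send(command);
  console.log("PutItemCommand succeeded:", result);
  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Data stored successfully." }),
  };
} catch (error) {
  console.error("Failed to store subscriber count:", error);
  return {
    statusCode: 500,
    body: JSON.stringify({ message: "Failed to store data." }),
  };
}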
Retrieving Subscribers
The get_subscribers function is based on a DynamoDB client with a QueryCommand() (from the AWS SDK for JavaScript v3 again). Since we want to search the table for a timestamp between now and 24 hours ago, we generate the corresponding Unix timestamps at first.
//Define the time window for the number of subscribers.
const currentTime = Date.now(); //Timestamps are often represented as Unix timestamps, which are the number of milliseconds that have elapsed since January 1, 1970. These timestamps can be stored as numbers in a DynamoDB table.
const twentyFourHoursAgo = currentTime - 5 * 60 * 1000; // Testing window: 5 minutes. Production value: 24 * 60 * 60 * 1000 (24 hours).
console.log("Current time:", currentTime);
console.log("Twenty-four hours ago:", twentyFourHoursAgo);
Now we can declare the QueryCommand():
TableName, IndexName - The command needs to know the (runtime) name of the table as well as the name of the index to query. We want to query the Newsletter Subscribers Table by our global secondary index (GSI) TimeStampIndex.
KeyConditionExpression, ExpressionAttributeNames, ExpressionAttributeValues - The key condition expression defines that the QueryCommand() fetches all items whose hash key or range key fulfills a certain condition. It uses placeholders for the keys (#GSIhashKey and #timestamp in our case) as well as for the reference values (:allitems, :start and :end in our case). Within the expression attribute names, we specify which key from our Dynamo component definition each key placeholder stands for. In the same way, the expression attribute values define which real reference values the value placeholders stand for. So with our QueryCommand(), we fetch all items with the GSI hash key "ALL_ITEMS" (which is literally all items) AND whose timestamp lies between the current timestamp and the timestamp from 24 hours ago (which means that only one item remains).
ProjectionExpression, Select - "SPECIFIC_ATTRIBUTES" means that only the attributes specified in the projection expression will be returned. For the get_subscribers function it is sufficient to return the subscribers attribute.
After its declaration, the QueryCommand() is sent to the DynamoDB client to run the query and retrieve an array with the items we are looking for. In our case, the array should only contain one item, since there is only one timestamp between now and 24 hours ago in our DynamoDB table. The Lambda function get_subscribers finally returns this array in a standard JSON format (together with some more information).
//Initialize a new DynamoDB client and define the QueryCommand to query the number of subscribers regarding your time window.
console.log("Fetching data from DynamoDB table...");
const client = new DynamoDBClient({});
const command = new QueryCommand({
  TableName: Resource.NewsletterSubscribersTable.name,
  IndexName: "TimeStampIndex",
  KeyConditionExpression: "#GSIhashKey = :allitems AND #timestamp BETWEEN :start AND :end",
  ExpressionAttributeNames: {
    "#GSIhashKey": "fixedHashKey",
    "#timestamp": "timeStamp",
  },
  ExpressionAttributeValues: {
    ":allitems": { S: "ALL_ITEMS" },
    ":start": { N: twentyFourHoursAgo.toString() },
    ":end": { N: currentTime.toString() },
  },
  ProjectionExpression: "subscribers",
  Select: "SPECIFIC_ATTRIBUTES",
  ScanIndexForward: false,
});
//Send the QueryCommand to the DynamoDB table.
const response = await client.send(command);
console.log("Total subscribers fetched:", response.Items);
return {
  statusCode: 200,
  body: JSON.stringify({
    status: "success",
    message: "Query succeeded",
    data: response.Items,
  }),
};
Here as well, you can earn some bonus points by implementing proper error handling. (🤓)
Frontend: Next.js Site
We are done with our infrastructure components and our backend logic. So only the frontend to display the current number of newsletter subscribers on a web interface is missing for our Social Stats Dashboard. As a reminder: We have already created a Nextjs SST component which deploys a Next.js application rooted in the app directory to AWS. That’s why we are heading over to the page.tsx file in the app directory now.
In our page.tsx file we work with the “use client” directive and create a SocialMediaCrawler() function which is executed every time our Next.js site is (re)loaded. The Social Media Crawler mainly consists of two parts:
Retrieve the current number of subscribers by invoking the get_subscribers lambda function.
Display the current number of subscribers on a web interface.
"use client";
export default function SocialMediaCrawler() {
//Part 1: Retrieve current number of subscribers by invoking the get_subscribers lambda function
//Part 2: Display the current number of subscribers on a web interface
}
Part 1: Retrieve Data
We invoke the Lambda function get_subscribers every time the Social Media Crawler is executed with the help of an API Route and the useState / useEffect React Hooks. Let’s take a quick look at the API route first. If you want to invoke a Lambda function directly from a Client Component in a Next.js app, you can do so by setting up an API route that acts as a proxy to your Lambda function. Of course, it is also possible to call the Lambda function directly from the Client Component, but we choose the proxy way this time, although it is definitely a bit over-engineered. Every file in the pages/api directory is treated as an API endpoint. Our invoke_lambda endpoint (pages/api/invoke_lambda.ts) invokes the get_subscribers function through an AWS Lambda service instance and returns the response from the get_subscribers function (JSON-formatted statusCode and body) as a JavaScript object.
//The required imports for this API route (assuming the AWS SDK for JavaScript v2 package "aws-sdk" is installed).
import AWS from "aws-sdk";
import type { NextApiRequest, NextApiResponse } from "next";
import { Resource } from "sst";

//Create an instance of the AWS Lambda service client. This instance allows you to interact with AWS Lambda to perform operations such as invoking Lambda functions.
const lambdaService = new AWS.Lambda({
  region: "us-east-1",
});

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const params = {
    FunctionName: Resource.GetTotalSubscribers.name,
    InvocationType: 'RequestResponse',
  };
  //Send a request to invoke the get_subscribers function. '.promise()' converts the AWS SDK request into a promise, allowing you to use 'await' for asynchronous handling.
  const result = await lambdaService.invoke(params).promise();
  console.log("Response from Lambda service:", result);
  //'result.Payload' is the JSON string that is returned by the get_subscribers function. 'JSON.parse()' converts the JSON string into a JavaScript object, allowing you to access the data returned by the lambda function.
  const payload = JSON.parse(result.Payload as string);
  console.log("Return value of lambda function:", payload);
  res.status(200).json(payload);
}
Back in our page.tsx file in the app directory, we can now fetch data from the '/api/invoke_lambda' endpoint with a useEffect Hook. Then we manipulate the fetched data so that we can pass the subscriber count to the variable totalSubscribers. This value of totalSubscribers (finally🥳) is the current number of AWS Fundamentals Newsletter subscribers.
const [totalSubscribers, setTotalSubscribers] = useState(null);
useEffect(() => {
  const fetchData = async () => {
    console.log("Fetching Data...");
    const response = await fetch('/api/invoke_lambda');
    const result = await response.json();
    const allData = JSON.parse(result.body);
    const subscribersObj = allData.data[0].subscribers;
    const subscribersCount = subscribersObj.N;
    setTotalSubscribers(subscribersCount);
  };
  fetchData();
}, []);
Part 2: Display Data
The content and structure of the Social Stats Dashboard’s web interface is defined in the JSX part of our page.tsx file. {totalSubscribers} is a placeholder that dynamically inserts the value of the totalSubscribers variable into the JSX paragraph element. This is how the current number of AWS Fundamentals Newsletter subscribers is displayed on our Social Stats Dashboard whenever the Next.js site is reloaded.
return (
  <>
    <Head>
      <title>Social Stats</title>
    </Head>
    <div>
      <h1>Watch the AWS Fundamentals Community grow☀️</h1>
      <p>-- Social Media Crawler --</p>
    </div>
    <div>
      <p>ConvertKit</p>
      <p>Newsletter Subscribers: {totalSubscribers}</p>
    </div>
  </>
);
Summary
At the end of this article, our simple little Social Stats Dashboard looks something like this:
It’s not styled, but it gets the job done! 🎉
By developing the dashboard, we have learned how to work with AWS Lambda to fetch data from a third-party application, store it in a DynamoDB table, pass it to a Next.js application upon request and finally display it dynamically on a web interface. And since we used the SST framework, we were able to define our entire serverless infrastructure directly in our code.