Stripe is one of the most prominent developer tools for integrating payments into your website or application. The service lets you accept payments from users across a wide range of countries and currencies, and all this is relatively easy to set up! However, not every business needs a full-fledged website for collecting payments from their customers. In this short tutorial, we'll build an app on Appsmith that generates Stripe payment links for you directly from your dashboard. You can create as many payment links as you like and share them with your customers over email; no checkout page or website required.
Appsmith is an open-source application builder that integrates with custom APIs and databases. It's perfect for building your team's internal tools, admin panels, and dashboards.
Let's dive in!
The first step in building a payment link generator is to set up a Stripe account. You can either create a new account or log in if you're an existing user.
Please note that we'll be building this application in test mode; Stripe requires additional information about your business before it can generate live payment links. To make this a fully functional application, you will need to add those details, along with your bank and tax information.
Your dashboard will look like this:
Even in test mode, you will be able to access all the features of the Stripe APIs; however, you won't be able to complete real transactions through the generated links.
The next step is to make our API requests from Appsmith; we’ll need to copy the secret key that’s available on the main page of the dashboard.
This secret key lets us access our Stripe account via Bearer Token-based authentication.
In the next section, we'll build a simple UI that lets us generate payment links based on the given customer information and payment price.
The first step is to create an account on Appsmith. In this guide, I'll be using the cloud version of Appsmith, but you can always choose to use Appsmith locally or self-host it on your server.
Now, click on the widgets tab and drag and drop a container widget onto the canvas; this will allow us to group all our widgets. The container is completely customizable; you can add borders, background colours, shadows, and more by opening the property pane.
Inside the container widget, drag and drop a new form widget and add a few input widgets onto the container that lets us collect information for payment links:
We could also add some additional configuration based on the information that needs to be collected, referring to the Stripe Documentation.
Following is a screenshot of how the UI looks on Appsmith:
Next, let’s create a new datasource: an API endpoint that’ll create a new Stripe payment link.
POST https://api.stripe.com/v1/checkout/sessions
Authorization: Bearer <token>
To bind the data on the API, we’ll need to use moustache bindings and the input widgets’ names. Here’s how we can access the data from the price amount widget:
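For example, if the amount input widget is named PriceInput (a placeholder name; use your own widget's name), one of the key-value pairs in the API's form body might look like the following sketch. Stripe expects amounts in the smallest currency unit, hence the multiplication by 100:

```
line_items[0][price_data][unit_amount] = {{ PriceInput.text * 100 }}
```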
Similarly, we add all the required fields on the payload to create a new session. Here’s a screenshot of what the payload looks like:
Our API is now ready; let’s add one more input widget to display the generated Stripe Session link (the payment link) created from the data passed through our input widgets.
We’ll need to bind the response from the API endpoint onto this widget; we can do this by binding the following:
{{stripe_Session.data.url}}
The .data property on an API request will return the response from the API endpoint; here, we’ve accessed the URL field, which is essentially the payment link.
If you open this URL, you’ll see a new Stripe session with the amount and details you’ve entered on the form.
Here’s a recording of how this works:
If you’re interested in using a database not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.
Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.
Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service. (A data warehouse is a type of data management system designed to enable and support business intelligence (BI) activities, especially analytics.) It makes it easy to manage your data warehouse and automatically distributes your data across multiple nodes to eliminate hot spots and provide high availability. It’s also an excellent database to build a frontend on top of, though it has a few specifics that might make you rethink your current strategy. With Appsmith, however, it’s possible to create a fully functional and custom frontend in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes a few minutes, and you can quickly build tools on top of the database of your choice.
This blog will teach you how to build a frontend that can connect to Redshift as a datasource.
On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including Redshift. All we need to make the connection are the endpoint, database name, and user credentials. With this in mind, let’s get started.
Note: I’m using a free Redshift account on Amazon Web Services (AWS) in this example.
Here’s what the configuration would look like:
The basic configuration is complete, so we will now use the sample data already loaded on Redshift (the TICKIT dataset).
Note: After the connection is established, we can see all the tables under the connected datasource.
Now, let’s use the Category table to build our CRUD app!
First, let’s read our data from the database and display it on a beautiful table widget. Follow the below steps:
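The read query itself can be as simple as the following (a sketch based on the TICKIT sample data's category table):

```sql
SELECT * FROM category LIMIT 10;
```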
We now have our query; let's bind this onto the table widget; for this, follow the below steps:
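Assuming the read query is named getCategories (a placeholder name), the Table Data property of the table widget can be set to:

```
{{ getCategories.data }}
```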
With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.
To add the create operation on Redshift, let’s build the UI.
Here, we have three input widgets to capture the details of our new category. We can configure the default values, labels, and placeholders by selecting the respective property panes. Now, let’s write the query to create a new category on Redshift.
Follow the steps below:
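A sketch of the create query; the widget names are placeholders, and we assume catid is populated separately (add it to the column list if your table requires it):

```sql
INSERT INTO category (catgroup, catname, catdesc)
VALUES (
  '{{ CategoryGroupInput.text }}',
  '{{ CategoryNameInput.text }}',
  '{{ CategoryDescInput.text }}'
);
```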
The Update operation is quite similar to the create operation. First, let’s build UI by creating a new custom column on the table by clicking on ADD A NEW COLUMN under the columns property.
Now, rename the column to Edit Category, and click on the cog icon next to it to configure the column settings. Under this, we’ll see the column-type property set to a Button type. When clicked, a modal should open up with the necessary fields to update the item.
Now, copy-paste Modal1, rename it to Modal2 and set the onClick property of the Edit Category button to open Modal2. Here, in the form, we can set the default value to show existing information; to display this, use the selectedRow property from the table widget.
Let’s write the Edit query using SQL:
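A sketch of the edit query, assuming placeholder widget names on Modal2:

```sql
UPDATE category
SET catgroup = '{{ EditGroupInput.text }}',
    catname  = '{{ EditNameInput.text }}',
    catdesc  = '{{ EditDescInput.text }}'
WHERE catid = {{ Table1.selectedRow.catid }};
```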
Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
We’ll now need to configure the submit button; for this, go back to Modal2 and set the button’s onClick property to execute a query and choose editCategory under the events property:
The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set its column type to a button. For this:
Now, let’s write the Delete query using SQL:
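A sketch of the delete query:

```sql
DELETE FROM category
WHERE catid = {{ Table1.selectedRow.catid }};
```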
Set the Delete Category button’s onClick property to run the deleteCategory query.
With these four operations configured, you will be able to read and analyze information from your database, edit the data, add or delete information and update records.
If you’re interested in using a database not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.
Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.
Wrapping your mind around how Redis handles data structures and associations can sometimes be challenging, as it is with other non-relational database engines. This is particularly true when Redis is compared to more traditional relational databases with quarantined tables containing multiple rows and columns to house data. Moreover, building a UI on top of Redis and managing multiple queries on the database can also be a complicated process.
With Appsmith, it’s possible to create a fully functional and custom frontend in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes a few minutes, and you can quickly build tools on top of the database of your choice.
This blog will teach you how to build a frontend that can connect to Redis as a datasource.
Redis is an open-source, in-memory key-value store built for high performance and ease of application development; it is commonly used as a database, cache, and message broker. While Redis supports many of the operations you'd expect from a traditional database, as a NoSQL store it differs from them in how it describes relationships between data objects.
On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including Redis.
All we need to make the connection are the host address, port, and database password. With this in mind, let’s get started.
Here’s what the configuration would look like:
We are done with the basic configuration. Now, let’s create some data on Redis and learn a few basic operations on Appsmith.
For the vast majority of data storage with Redis, data will be stored in a simple key/value pair. This can be done using GET and SET commands.
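For example (a minimal sketch with a made-up key and value):

```
SET book:1 "The Pragmatic Programmer"
GET book:1
```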
Using the KEYS command, we can fetch all the keys stored on the Redis datasource. To query this on Appsmith, follow the below steps:
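The query is simply:

```
KEYS *
```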
This will fetch all the keys from the data source; the following is the screenshot:
You can use this information to bind it on any widget on Appsmith using the moustache operator; in this case, the data from this query can be accessed by the following snippet:
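Assuming the query is named get_all_keys (a placeholder name), the snippet would be:

```
{{ get_all_keys.data }}
```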
We may want to store some information from our application on Redis, for example, the title and author of a few of our favourite books.
To do this on Appsmith, you can use Input widgets, to collect the data dynamically or directly create it from the query pane. In the following screenshot, I’ve dragged and dropped two input widgets and a button widget to dynamically take inputs and create key-value pairs on Redis datasource.
Next, I've created a new query on the Redis datasource that dynamically takes inputs from the widgets; for this:
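A sketch of the query, assuming the two input widgets are named TitleInput and AuthorInput:

```
SET "{{ TitleInput.text }}" "{{ AuthorInput.text }}"
```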
Here, we have an insert query that collects all the data from the input widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
With this, we can customize and build any kind of UI on top of the Redis datasource.
If you’re interested in using a database that is not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.
Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.
Low-code-based web applications are often discussed when talking about the future of software development. While some developers call it 'no-code,' others call it 'low-code'; no matter what you call them, these tools are showing no signs of slowing down in growth among small to large businesses, especially as this technology brings about great change even on the back-end side of things. One major technology that seems to have worked well in this market is Google's Firebase, a platform for app building created by Google. Firebase's SDKs are open source, making it very accessible for developers to swiftly prototype and integrate its services into their apps.
With Firebase and its database, Firestore, most back-end needs are fulfilled without writing code. However, you still can't easily build web-based internal applications or admin panels on top of it with front-end frameworks, because building a UI from scratch is not easy.
This part, however, can be simplified with Appsmith, where you can create a fully functional and custom front-end in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes only a few minutes, and you can quickly build tools on top of the database of your choice.
This blog will teach you how to build a front-end that can connect to Firestore as a datasource.
Firestore is a NoSQL document database built for automatic scaling, high performance, and ease of application development. While the Firestore interface has many of the same features as traditional databases, as a NoSQL database, it differs from them in how it describes relationships between data objects.
On Appsmith, it's pretty straightforward to establish a connection with any datasource, including Firestore.
What we need to make the connection are the Database URL, Project Id, and Service Account Credentials. With this in mind, let's get started.
Here’s what the configuration would look like:
Note: For service account credentials, generate a new private key and copy its contents.
We are done with the basic configuration. Now, let's create a collection on Firestore to build a simple to-do list application and learn all the basic CRUD operations on top of Appsmith.
On Firestore, it's super easy to do this from the console: just hit the create collection button on the dashboard and define all the attributes in the model.
Following are the attributes and data types we use:
Alrighty, our collection is now created; let's get into CRUD.
First, let's read our data from the database and display it on a beautiful table widget. Follow the below steps:
We now have our query; let's bind this onto the table widget; for this, follow the below steps:
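Assuming the read query is named getTasks (with the Firestore method set to fetch the documents in the tasks collection; names here are placeholders), bind it to the table widget's Table Data property:

```
{{ getTasks.data }}
```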
With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.
To add the create operation on Firestore, let's build the UI.
Here, we have three input widgets to capture the details of our new task. We can configure the default values, labels, and placeholders by selecting the respective property panes. Now, let’s write the query that lets us create a new task on Firestore.
Note: The default value is set to the logged-in user's name using Appsmith’s context object; you can do this by binding {{appsmith.user.name}} in the default text property.
Follow the steps below:
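A sketch of the query body, assuming a create-document command on the tasks collection and placeholder widget names (the attributes should match the ones defined on your collection):

```json
{
  "title": "{{ TitleInput.text }}",
  "assignee": "{{ AssigneeInput.text }}",
  "status": "{{ StatusInput.text }}"
}
```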
Here, we have an insert query that collects all the data from the form widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
Lastly, we’ll need to configure the submit button; for this, go back to the modal and set the button’s onClick property to execute a query and choose createTask under the events property:
The Update operation is quite similar to the create operation. First, let’s build UI by creating a new custom column on the table by clicking on ADD A NEW COLUMN under the columns property.
Now, rename the column to Edit, and click on the cog icon next to it, to configure column settings. Under this, we’ll see column-type properties set to a Button type. When clicked, a modal should open up with the necessary fields to update the item.
Now, copy-paste Modal1 and rename it to Modal2, and set the onClick property of the Edit Task button to open Modal2. Here, in the form, we can also set the default value to show existing information; to display this, use the selectedRow property from the table widget.
Let’s write the Edit query:
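A sketch, assuming an update-document command with the document path set to {{ Table1.selectedRow._ref }} and placeholder widget names on Modal2:

```json
{
  "title": "{{ EditTitleInput.text }}",
  "assignee": "{{ EditAssigneeInput.text }}",
  "status": "{{ EditStatusInput.text }}"
}
```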
Note: The {{ Table1.selectedRow._ref }} snippet evaluates to the selected row’s _ref, which points to the document we want to edit.
Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
We’ll now need to configure the submit button; for this, go back to Modal2 and set the button’s onClick property to execute a query and choose editTask under the events property.
The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set its column type to a button. For this:
Now, let’s write the Delete query:
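A sketch, assuming a delete-document command; the only configuration needed is the document path:

```
{{ Table1.selectedRow._ref }}
```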
With these four operations configured, you will be able to read and analyze information from your database, edit the data, add or delete information and update records.
If you’re interested in using a database that is not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.
Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.
There are many parts to building an app, and designing UI elements can take up most of a developer’s time when building from scratch. However, with Appsmith, it’s possible to create a fully functional and custom frontend in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes a few minutes, and you can quickly build tools on top of the database of your choice. For example, you can create admin panels to manage product catalogues, read content data from your database and use that to populate your e-commerce website, and then write more data and update your existing orders in the database. There are so many possibilities!
In this blog, I will teach you how to build a frontend that can connect to MariaDB as a datasource.
MariaDB Server is one of the most popular open-source relational databases. It’s made by the original developers of MySQL and guaranteed to stay open source. It is part of most cloud offerings and the default in most Linux distributions. It is built upon the values of performance, stability, and openness.
On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including MariaDB, be it on the cloud, self-hosted, or in a local environment.
What we need to make the connection are the endpoint, database name, and user credentials. With this in mind, let’s get started.
Here’s what the configuration would look like:
We are done with the basic configuration. Now, let’s create a new table and seed it to build a fully customisable CRUD app on MariaDB.
Note: After the connection is established, we can see all the tables under the connected datasource.
This is a simple SQL query that’ll create a projects table; the idea is to build a simple CRUD application that’ll let us manage open-source projects.
Also, note that we’ve seeded the table with an insert statement that adds the Appsmith project.
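If you'd like a concrete sketch of both statements, they might look like this (the column names are illustrative):

```sql
CREATE TABLE projects (
  id INT AUTO_INCREMENT PRIMARY KEY,
  project_name VARCHAR(100) NOT NULL,
  description VARCHAR(255)
);

INSERT INTO projects (project_name, description)
VALUES ('Appsmith', 'An open-source framework to build internal tools.');
```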
Alrighty, now that our table is created, let’s get into CRUD.
First, let’s read our seed data from the database and display it on a beautiful table widget. Follow the below steps:
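The read query can be a simple select (assuming the projects table sketched above):

```sql
SELECT * FROM projects;
```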
We now have our query; let's bind this onto the table widget; for this follow the below steps:
With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.
To add the create operation on MariaDB, let’s build the UI.
Here, we have two input widgets to capture the details of our new project. We can configure the default values, labels, and placeholders by selecting the respective property panes. Now, let’s write the query that lets us create a new project on MariaDB.
Follow the steps below:
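A sketch of the create query, with placeholder widget names:

```sql
INSERT INTO projects (project_name, description)
VALUES ('{{ ProjectNameInput.text }}', '{{ DescriptionInput.text }}');
```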
Here, we have an insert query that collects all the data from the form widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
Lastly, we’ll need to configure the submit button; for this, go back to the modal and set the button’s onClick property to execute a query and choose createProject under the events property:
The Update operation is quite similar to the create operation. First, let’s build UI by creating a new custom column on the table by clicking on ADD A NEW COLUMN under the columns property.
Now, rename the column to Edit Project, and click on the cog icon next to it, to configure column settings. Under this, we’ll see column-type properties set to a Button type. When clicked, a modal should open up with the necessary fields to update the item.
Now, copy-paste Modal1, rename the copy to Modal2, and set the onClick property of the Edit Project button to open Modal2. Here, in the form, we can also set the default value to show existing information; to display this, use the selectedRow property from the table widget.
Let’s write the Edit query using SQL:
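A sketch, assuming placeholder widget names on Modal2:

```sql
UPDATE projects
SET project_name = '{{ EditNameInput.text }}',
    description  = '{{ EditDescriptionInput.text }}'
WHERE id = {{ Table1.selectedRow.id }};
```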
Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
We’ll now need to configure the submit button; for this, go back to Modal2 and set the button’s onClick property to execute a query and choose editProject under the events property:
The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set its column type to a button. For this:
Now, let’s write the Delete query using SQL:
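A sketch of the delete query:

```sql
DELETE FROM projects
WHERE id = {{ Table1.selectedRow.id }};
```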
Set the Delete Project button’s onClick property to run the deleteProject query.
With these four operations configured, you will be able to read and analyze information from your database, edit the data, add or delete information and update records.
If you’re interested in using a database that is not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.
Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.
Designing UI elements can take up a majority of a developer’s time when building from scratch. However, with Appsmith, it’s possible to create a fully functional and custom frontend in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes a few minutes, and you can easily build tools on top of the database of your choice. For example, you can build admin panels to manage product catalogues, read content data from your database and use that to populate your e-commerce website, and then write more data and update your existing orders in the database. The possibilities are countless.
In this blog, I will teach you how to build a frontend that can connect to SnowflakeDB as a datasource.
Snowflake is a cloud-based data warehouse offered as software-as-a-service (SaaS) that requires no hardware or software installation. Snowflake handles the maintenance and tuning of the cloud infrastructure. It is based on a new SQL database engine with unique features and advantages over more traditional data warehousing approaches.
On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including SnowflakeDB, be it on the cloud, self-hosted, or in a local environment.
What we need to make the connection are the endpoint, database name, and user credentials. With this in mind, let’s get started.
Here’s what the configuration would look like:
We are done with the basic configuration. Now, let’s use the default database from SnowflakeDB to build a fully customisable CRUD app.
Note: After the connection is established, we can see all the sample data (tables) under the connected datasource.
Now that we have the sample data, in the next section, let’s build a fully-fledged CRUD application (on the customer table) on top of our SnowflakeDB using Appsmith.
First, let’s read our seed data from the database and display it on a beautiful table widget. Follow the below steps:
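The read query might look like this (a sketch assuming the CUSTOMER table from Snowflake's sample data):

```sql
SELECT * FROM CUSTOMER LIMIT 20;
```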
We now have our query; let's bind this onto the table widget; for this follow the below steps:
With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.
To add the create operation on SnowflakeDB, let’s build the UI.
Drag and drop a button widget onto the canvas. Open its property pane, set the onClick property to Open a New Modal, and choose Create New.
This will open up a new modal now; let’s drag and drop a few widgets to create a form that we can use to add new customers into our database.
Here, we have five input widgets to add our customers. We can configure the default values, labels, and placeholders by selecting the respective property panes. Now, let’s write the query that lets us create a new customer on SnowflakeDB.
Follow the steps below:
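A sketch of the create query; the column and widget names are illustrative, and we assume the sample table has been copied into a database you can write to (Snowflake's bundled sample database is read-only):

```sql
INSERT INTO CUSTOMER (C_NAME, C_ADDRESS, C_PHONE, C_ACCTBAL, C_COMMENT)
VALUES (
  '{{ NameInput.text }}',
  '{{ AddressInput.text }}',
  '{{ PhoneInput.text }}',
  {{ BalanceInput.text }},
  '{{ CommentInput.text }}'
);
```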
Here, we have an insert query that collects all the data from the form widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
Lastly, we’ll need to configure the submit button; for this, go back to the modal and set the button’s onClick property to execute a query and choose insertCustomer under the events property:
The Update operation is quite similar to the create operation. First, let’s build UI by creating a new custom column on the table by clicking on ADD A NEW COLUMN under the columns property.
Now, rename the column to Edit Customer, and click on the cog icon next to it, to configure column settings. Under this, we’ll see column-type properties set to a Button type. When clicked, a modal should open up with the necessary fields to update the item.
Now, copy-paste Modal1, rename the copy to Modal2, and set the onClick property of the Edit Customer button to open Modal2. Here, in the form, we can also set the default value to show existing information; to display this, use the selectedRow property from the table widget.
Let’s write the Edit query using SQL:
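A sketch, with placeholder widget and column names:

```sql
UPDATE CUSTOMER
SET C_NAME  = '{{ EditNameInput.text }}',
    C_PHONE = '{{ EditPhoneInput.text }}'
WHERE C_CUSTKEY = {{ Table1.selectedRow.C_CUSTKEY }};
```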
Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
We’ll now need to configure the submit button; for this, go back to Modal2 and set the button’s onClick property to execute a query and choose editCustomer under the events property.
The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set its column type to a button. For this:
Now, let’s write the Delete query using SQL:
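A sketch of the delete query:

```sql
DELETE FROM CUSTOMER
WHERE C_CUSTKEY = {{ Table1.selectedRow.C_CUSTKEY }};
```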
Set the Delete Customer button’s onClick property to run the deleteCustomer query.
With these four operations configured, you will be able to read and analyze information from your database, edit the data, add or delete information and update records.
We’ve got extensive documentation on the SnowflakeDB datasource integration!
If you’re interested in using a database that is not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.
Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.
A major pain point around building apps is designing the UI elements. Fortunately, with Appsmith, you can create a custom frontend in minutes. Connecting datasources with Appsmith takes a few minutes, and you can easily build tools on top of the database of your choice. For example, you can build admin panels to manage product catalogs, read content data from your database and use that to populate your e-commerce website, and then write more data and update your existing orders in the database. The possibilities are countless.
In this blog, I will teach you how to build a frontend that can connect to ArangoDB as a datasource.
ArangoDB is a free and open-source native multi-model database system developed by ArangoDB GmbH. The database system supports three data models with one database core and a unified query language AQL. Being multi-model, ArangoDB allows you to run ad-hoc queries on data stored in different models.
On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including ArangoDB, be it on the cloud, self-hosted, or in a local environment.
What we need to make the connection are the endpoint, database name, and user credentials. With this in mind, let’s get started.
Here’s what the configuration would look like:
We are done with the basic configuration. Now, let’s create a collection on ArangoDB and push some data from Appsmith to the database. For this, you’ll need to open the ArangoDB endpoint and use the graphical user interface (GUI).
Let’s name the collection ‘Characters’ and set its type to ‘Document’.
Now let’s seed the collection with some data on Appsmith.
For this, follow the steps below:
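A seed query in AQL might look like this (the document fields are illustrative):

```aql
INSERT { name: "Jon Snow", house: "Stark", alive: true } INTO Characters
```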
Now that we have our seed data, in the next section, let’s build a fully-fledged CRUD application on top of our ArangoDB using Appsmith.
First, let’s read our seed data from the database and display it on a beautiful table widget. Follow the below steps:
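The read query in AQL:

```aql
FOR character IN Characters
  RETURN character
```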
We now have our query; let's bind this onto the table widget; for this follow the below steps:
With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.
To add the create operation on ArangoDB, let’s build the UI.
Here, we have three input widgets, a checkbox, and a multi-select widget to add our characters. We can configure the default values, labels and placeholders by selecting the respective property panes. Now, let’s write the query that lets us create a new item on ArangoDB.
Follow the steps below:
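A sketch of the create query, with placeholder widget names:

```aql
INSERT {
  name: "{{ NameInput.text }}",
  house: "{{ HouseInput.text }}",
  alive: {{ AliveCheckbox.isChecked }}
} INTO Characters
```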
Here, we have an insert query that collects all the data from the form widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
Lastly, we’ll need to configure the submit button, for this, go back to the modal and set the button’s onClick property to execute a query and choose insertCharacter under the events property:
The Update operation is quite similar to the create operation.
Let’s write the Edit query using AQL:
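A sketch, assuming placeholder widget names on Modal2:

```aql
UPDATE "{{ Table1.selectedRow._key }}"
WITH {
  house: "{{ EditHouseInput.text }}",
  alive: {{ EditAliveCheckbox.isChecked }}
}
IN Characters
```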
Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.
Next, configure the submit button, for this, go back to Modal2 and set the button’s onClick property to execute a query and choose editCharacter under the events property.
The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set its column type to a button. For this:
Now, let’s write the Delete query using AQL:
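A sketch of the delete query:

```aql
REMOVE "{{ Table1.selectedRow._key }}" IN Characters
```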
Set the Delete Character button’s onClick property to run the deleteCharacter query.
With these four operations configured, you will be able to read and analyze information from your database, edit the information, add or delete information and update records.
We’ve got extensive documentation on the ArangoDB datasource integration, along with a video explainer.
If you’re interested in using a database that is not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.
Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.
Developers spend quite a bit of time building internal tools, admin panels, and applications for back-office tasks that help automate everyday essential business processes. This involves significant effort, from maintaining a dedicated database to writing lots of frontend and backend code. But what if we told you that you could utilize a modern stack to build such applications that can help with your backend, frontend, and automation tasks? Sounds good, right? It is!
We’re happy to introduce a great new stack to build applications: The Supabase, Appsmith and n8n stack (SAN Stack) for developers to build and maintain modern custom internal tools.
SAN stands for Supabase, Appsmith and n8n, after the three emerging and notable pieces of software that make up the stack.
Supabase: The open-source Firebase alternative that lets you create a backend in minutes. Start your project with a Postgres database, authentication, instant APIs, real-time subscriptions, and storage.
Appsmith: An open-source framework to build custom business software with pre-built UI widgets that connect to any data source, and can be controlled extensively using JavaScript.
n8n: An extendable workflow automation tool. With a fair-code distribution model, n8n will always have visible source code, be available to self-host, and allow you to add your custom functions, logic and apps.
This stack lets you build any application within minutes. You can use Supabase for the database and backend, Appsmith for UI and adding functionality, and n8n for automating background tasks.
One of Appsmith’s co-founders and head of product, Nikhil Nandagopal, broke down the basics of app building into three steps.
This has gained quite some traction among developers, especially those looking to build internal tools or applications.
There are so many tools and applications that can be built with the SAN stack. Here are a couple of examples: an Employee Survey Dashboard and a Ticket Management Admin Panel.
Using the SAN stack, you can build any dashboard in just minutes.
As an example, I will show you how to create a support dashboard manager application.
Using this application:
Let's get started!
The first step is to set up the backend for the application; for this, we will be using a Postgres database on Supabase.
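For reference, a hypothetical tickets table for the support dashboard might look like this (the schema is illustrative):

```sql
CREATE TABLE tickets (
  id SERIAL PRIMARY KEY,
  title TEXT NOT NULL,
  description TEXT,
  status TEXT DEFAULT 'open',
  assignee_email TEXT,
  customer_email TEXT,
  created_at TIMESTAMP DEFAULT now()
);
```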
The DB is now set up; let's use Appsmith to connect to this Postgres DB and build the UI for the application. For this, we'll need to note down the connection info from the project settings on Supabase. Here's how it looks:
Our backend is ready; now, let's connect it to Appsmith to build UI and add functionalities. Follow the below steps:
Awesome, we now have established a connection to our data source. Next, let’s build UI using widgets on Appsmith.
On Appsmith, we can use moustache bindings anywhere across the app to bind data or write JavaScript code to customize and add functionality to your widgets.
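For example, a table widget could show only the tickets assigned to the logged-in user with a binding like this (the query and column names are placeholders):

```
{{ get_tickets.data.filter(ticket => ticket.assignee_email === appsmith.user.email) }}
```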
Fantastic, we should be able to see all the tickets assigned to the specific user! It’s that simple to write custom JS to configure our internal applications on Appsmith. Now let’s use a webhook and build an automation that sends emails from the ticket using n8n!
If you want to build an internal tool that requires sending emails, then n8n is the way to go. n8n is a tool that can be used to automate workflows between your favorite web apps (such as Slack, Google Drive, Dropbox, etc.). In fact, n8n can connect almost any two web apps that you use. Now, let's create a workflow and use a webhook to send requests to n8n from Appsmith.
Awesome, now that we have the Webhook, let’s connect it with Appsmith by adding it as a data source.
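The API body might look like this (the field names are placeholders for whatever your n8n workflow expects):

```json
{
  "to": "{{ Table1.selectedRow.customer_email }}",
  "subject": "Update on your support ticket",
  "message": "{{ MessageInput.text }}"
}
```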
And just like that, we should be able to send emails using n8n without writing a single piece of code.
Building this app from scratch, including writing snippets of code, is likely to take only about 30 minutes! Isn’t that fast?
If you're looking for a modern approach to building internal applications, check out Supabase, Appsmith, and n8n! These tools are straightforward, powerful, and can help you build apps faster than ever. So what are you waiting for? Start building your next internal app today.
While building an application, there are times when you have to use multiple data sources. It could be for security reasons or availability, or even cost issues. Often, connecting and managing those data sources can be challenging! You either need to build a system from scratch to connect those data sources or use a tool.
Let’s say you’re running an analytics dashboard, and you have data coming from multiple sources like MySQL, Redshift, S3, and MongoDB. Would you want your data scientists to focus on making all API connections, connecting all the data sources, and processing them in a specific format? Or, would you instead let them focus on the other critical parts of their job by using an easy solution and connect those data sources through an app?
Building a complete system from scratch is generally not advisable because of the enormous constraints on time. Teams also have to go through the documentation for each data source to build such a system, keep track of everything, such as changelogs and updates in the documentation, and change the code base accordingly. Handling this would mean hiring dedicated resources or putting the team under pressure to manage it all. Using an application to oversee the connections makes sense while ensuring that data sources are secure and reliable.
Building everything from scratch can be a time-consuming part that may also distract you from your path.
To help you stay on course, we’ve put together a nifty list of great tools to help you connect multiple data sources without hassles.
Talend is an open-source data automation tool that can be used to connect multiple data sources. It makes setting up an ETL (Extract, Transform, and Load) pipeline simpler and quicker. It offers a scalable architecture, robust data integration, and a suite of open-source tools divided into several components. All combined, they become a potent tool for ETL and for connecting multiple data sources.
Talend provides an easy and intuitive way to transform the data. Instead of mapping databases and filling out the forms for different databases, you can just use their graphical tool for mapping and transforming the data.
It also supports data conversion into multiple business formats such as OLAP, Jasper, SPSS, and Splunk.
Supported data sources:
ArcESB is another data integration and pipeline setup tool that can connect multiple data sources. It synchronizes the data in real time, which means as soon as data becomes available in the source, it becomes available through ArcESB.
It supports a wide array of protocols such as AS2, AS4, OFTP, SFTP, etc. The drag-and-drop approach makes it easy to connect complex workflows and quickly transform the data in popular formats like JSON, XML, and CSV.
Supported data sources:
Apache Camel is an open-source data integration tool that is somewhat underrated in the ETL space. It can be deployed as a standalone application, in a web container, in a JEE application server, in OSGi, or even in a Spring container, connecting complex workflows easily and transforming data quickly. It allows programmers to split integration problems into smaller pieces, closing the gap between integration and implementation.
It has 3.4k stars and 4.3k forks on GitHub. It supports almost every available protocol, like HTTP, FTP, JMS, EJB, JPA, RMI, JMX, LDAP, Netty, and many more. Therefore, it is a good choice when you’re dealing with multiple internet protocols. One of the best parts of Apache Camel is that it uses the same workflow for all the supported protocols.
Supported data sources:
Pentaho Kettle, also known as the Pentaho Data Integration tool, supports multiple data integrations, OLAP, data mining, reporting, and ETL capabilities. It is also known for its ease of use and quick learning curve. It allows users to create their own data manipulation jobs with a user-friendly graphical creator, without entering a single line of code.
It comes with a set of tools that includes:
Hevo is a no-code ETL tool that you can use to connect multiple data sources. It supports more than 100 data sources and follows a 3-step data connection setup across all of them: select the data source, enter the correct credentials, and select the destination.
It uploads the data to the selected destination and allows you to perform data manipulation and transformations. It also comes with a fault-tolerant system that ensures the data is consistently transferred in a secure manner using encryption.
Supported data sources:
We have created Appsmith in such a way that you can easily connect multiple data sources with just a few steps. You can even pass the data through an API using our platform. The drag-and-drop UI makes the connection part very easy and allows you to connect data sources like AWS, MySQL, MongoDB, etc.
The credentials that you store in our application are first encrypted before storing. So, you don’t have to worry about security. Since our platform acts as a proxy layer, we do not store any data from your data sources.
To improve performance, our platform creates a pool of connections with the database server. This allows you to run multiple queries simultaneously, which is impossible over a single connection. We have also created in-depth tutorials that you can check out to understand the benefits of other features as well.
Supported data sources:
Are you interested in building something with Appsmith? Take it for a spin! Join our vibrant community on Discord. To get regular updates on what we’re up to, follow us on Twitter.
React-admin has been one of the holy grail frontend frameworks for building responsive admin panels. It offers a lot of really cool features such as data validation, optimistic rendering, accessibility, and action undo. React-admin is also plug-and-play, as it supports standard REST APIs and a handful of GraphQL dialects. Being a Reactjs framework, it also gives you access to thousands of plugins and libraries available in JavaScript and the React ecosystem.
In this article, I would like to show you how to build an admin panel using React-admin.
We’re going to be building a dashboard to manage DVD movie rentals for a local rental store. The first page would have a table listing all registered members of the store. The second page will have a table that holds all rental records. From here, new rental entries can be created and existing rentals can be updated, i.e., from borrowed to returned. We would also be able to click on a customer from the first page and then be taken to the rentals page to see their rental history.
Here’s a gif and a link to the completed application.
You can view the demo app here.
Dashboard link: as-react-admin.netlify.app
username: cokoghenun@appsmith.com
password: 123456
Through building this dashboard, we’re going to cover core React-admin concepts such as
Since React-admin requires an API server, we would need to build one on top of the database. Speaking of the database, we’ll be making use of MongoDB, and the demo dataset is a modified version of the Sakila dataset.
To save time and get to the fun part of building the dashboard with React-admin, we’ll be making use of Loopback to generate a Nodejs API over the database. If you are not familiar with Loopback, it is a highly extensible Node.js and TypeScript framework for building APIs and microservices.
You can skip this if you already have an API to use
We’re almost set. But before we begin, I’d like to give you a mini-map of the entire article. The first part of this article will focus on generating an API server over the database on MongoDB using Loopback. The second part of this article would cover how to use React-admin to build a dashboard from the API generated in the first section.
Alright, everything looks good. Let’s get started!
There are many ways to build an API server. You can roll up your sleeves and build one yourself (this takes a lot of time), or you can choose to go with a framework. Loopback is the fastest framework I found to build Nodejs APIs over a database. It supports a host of databases, ranging from in-memory to document to relational databases.
The API that would be generated using Loopback will have three resources, the first being the customer resource that represents customers who come to rent DVDs from the store. We also have the film resource, representing DVDs that are in stock. Lastly, we have the rentals resource, which records each rental.
Here’s the schema for each resource
Okay! Now let’s get started by installing the Loopback CLI with npm:
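```bash
npm install -g @loopback/cli
```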
We can easily scaffold the Nodejs server using the Loopback CLI. It configures a Typescript compiler and installs all required dependencies. Let’s run the CLI and answer a few prompts to generate a new app
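```bash
lb4 app
```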
You should have your app configured as shown below
Hit enter and give the CLI some time to set up the app.
Now that the loopback app has been scaffolded, cd (change directory) into the app folder, and let’s start by creating a model for each resource. A model communicates the shape of each document for a particular resource, much like the schema shown earlier.
Let’s create a model for the customer resource using the Loopback CLI
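```bash
lb4 model
```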
As we did when generating the app, answer the CLI prompts. Yours should look like this
Great Job! Now, go ahead and do the same for the film and rental resources. Don’t forget that to create a new model, you’ll need to run the lb4 model command.
Next, we’ll need to link the Loopback app to the Mongo database. Loopback provides two entities to help us accomplish this, and they are the datasource and repository mechanisms.
A datasource represents a database connection that would be used to store and retrieve documents from the database, e.g., MongoDB or PostgreSQL. On the other hand, a repository links a resource on the Loopback app to a particular table or collection in the database. For example, the customer resource is linked to the Customer collection in the database using a repository.
Now, let’s add a datasource to the app, and link it to our MongoDB database. We can easily do this using the CLI command below
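```bash
lb4 datasource
```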
As usual, go ahead and answer the CLI prompts, supplying the database credentials to the CLI
Awesome! Now we can add a repository for each resource.
Run the command below and let’s set up a repository for the customer resource. Notice that we have to link the created resource to the target resource, and in this case, it is the customer resource
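```bash
lb4 repository
```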
Cool! Go ahead and do the same for the film and rental repositories. I’m confident you can finish up on your own 😜
Great Job! That was a lot we just covered. Right now, we have models for each resource, a datasource, and repositories linking each model to its respective collection in the database.
The last piece of the puzzle is to add CRUD functionality for each resource.
We can do this by creating controllers. Controllers do the grunt work of creating, reading, updating, and deleting documents for each resource.
As you may have already guessed, we can create a controller using the controller command. Now, let’s create a REST controller for the customer resource. Notice we’ll need to use the model and repository created earlier for the customer resource.
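```bash
lb4 controller
```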
Note that the Id is a string and is not required when creating a new instance
As usual, go ahead and do the same for the film and rental resources.
Awesome! We now have a full-blown REST API that was generated in a few minutes. Open up the project folder in your favorite code editor and you’ll see all the code(and folders) generated by Loopback.
I recommend you change the default port in the index.ts file to something else, e.g., 4000, because Create React App (used by React-admin) runs on port 3000 by default.
You can start the server using the start script
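```bash
npm start
```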
You can find a playground and the auto-generated API documentation for your server by visiting the server address in your browser, e.g., http://localhost:4000/
Alright! Now that we have a REST API server with CRUD functionality, we can move on to creating the admin dashboard using React-admin.
We’ve finally gotten to the fun part, yay!
As a quick recap, we have a Loopback API generated in the last section that serves the customer, film, and rental resource with the following endpoints and data schema
So here’s the game plan. We’re going to use this API to build a dashboard to manage DVD movie rentals. The first page would be a table showing all customers. Then we can click on a customer and view all their rentals on a new page. We can update the return date and status of each rental, i.e., from borrowed to returned. Lastly, we can view all rentals on the rentals page and create a new entry or edit an existing one.
Phew! Now we can finally begin with React-admin 😅
React-admin is a powerful front-end framework for building admin panels and dashboards. It is highly customizable and has a host of other great features. Since it is based on Reactjs, it can be used with thousands of other Reactjs and Javascript libraries.
React-admin requires a base Reactjs project. We are going to go with Create React App (CRA) in this article. So let’s set up the project with CRA:
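The project name below is just a placeholder; use whatever you like:

```bash
npx create-react-app rental-admin-panel
```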
Give the CLI some time to install all dependencies and finish setting up the project. Then, cd into the project directory and go ahead and install React-admin and the Loopback dataProvider:
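Assuming the react-admin-lb4 package for the Loopback dataProvider:

```bash
cd rental-admin-panel
npm install react-admin react-admin-lb4
```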
A dataProvider is the mechanism with which React-admin communicates with a REST/GraphQL API. The Loopback provider for React-admin enables it to understand and use Loopback APIs i.e how to paginate or filter requests. If you aren’t using a Loopback generated API, you should look into using one of these dataProviders for React-admin.
Open up the project in your favourite code editor and replace everything in the App.js file with the below starter code
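A minimal sketch of the starter code, assuming the react-admin-lb4 provider installed above (the server URL is a placeholder for your own API's address):

```jsx
// src/App.js
import * as React from 'react';
import { Admin, Resource } from 'react-admin';
import lb4Provider from 'react-admin-lb4';

// Point the dataProvider at the Loopback API
const dataProvider = lb4Provider('http://localhost:4000');

const App = () => (
  <Admin dataProvider={dataProvider}>
    {/* Register the customers endpoint as a resource */}
    <Resource name="customers" />
  </Admin>
);

export default App;
```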
So far so good. But we have some new concepts to clear up. In the starter code above, we supply a dataProvider to React-admin which enables it to query the API. The next thing we did up there is to register a resource from the API that we would like to use in React-admin. This is done simply by supplying the endpoint as a name prop to the <Resource> component.
You don’t need to add the forward-slash “/” to the resource name
Going by this rule, whenever we need to query a new API endpoint, we must register it as a resource; in this way, React-admin becomes aware of it. Moving on...
The easiest way to view all customers’ info is to have a paginated table displaying all customers’ info. React-admin makes it easy to do this by providing us with a <List> component.
The <List> component generates a paginated table that lists out all documents in a particular resource. We can choose which fields we want to show up on the table by wrapping them in the appropriate <Field> component, e.g., a date property on a document would be wrapped in a <DateField> component.
The data property on the document is linked to the <Field> component using the source prop. This prop must contain the exact property name. And the field name showing up on the table can be customized using the label prop.
We can also create a filter for the table using the <Filter> component and specify an action to be triggered whenever an item is clicked on the table using the rowClick prop on the <Datagrid> component. You can learn more about filtering here and row actions here.
Alright! So we want a customer table to show all the customers. We also want this table to be filterable by customer email. Lastly, we want to be able to click on a customer and see all their rentals (we haven’t created the rentals page yet, but we will shortly).
Let’s see all of this in action. Go ahead to create a customer list component with the following content
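Here's a sketch of the component; the field names depend on your customer schema:

```jsx
// src/components/CustomerList.js
import * as React from 'react';
import { List, Datagrid, TextField, EmailField, Filter, TextInput } from 'react-admin';

// Filter the table by customer email
const CustomerFilter = (props) => (
  <Filter {...props}>
    <TextInput label="Search by email" source="email" alwaysOn />
  </Filter>
);

export const CustomerList = (props) => (
  <List filters={<CustomerFilter />} {...props}>
    {/* rowClick can later be customized to navigate to the customer's rentals */}
    <Datagrid rowClick="edit">
      <TextField source="first_name" />
      <TextField source="last_name" />
      <EmailField source="email" />
    </Datagrid>
  </List>
);
```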
Next, we need to link the <CustomerList> component with the customer resource component.
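This is done by passing it to the resource's list prop:

```jsx
<Resource name="customers" list={CustomerList} />
```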
Save your code and let’s head over to the browser. You can see we have a nice paginated, and filterable customer table that has been automatically generated and is rendering customer information from the API. Cool right? 😎
Not so fast! Go ahead and create a similar list table for the rental resource. You can name this component RentalList. If you are curious or get stuck, feel free to fall back on the code here.
We have two more views to create and they are the edit and create view for the rental resource. They are quite similar to each other and are both similar to the list view but with a few differences.
The edit view would be used to edit an item clicked on the rental table.
To wire up this behaviour, ensure that you have rowClick='edit' on the <Datagrid> component in <RentalList>.
An edit view uses a <SimpleForm> component, which in reality is a simple form with nested <Input> components. Like with the <Field> components, each <Input> component used is based on the data type of the property to be edited, e.g., a <TextInput> component is used on a text property. Inputs also require the source prop and an optional label prop, as we’ve already seen with the <Field> component.
Bringing it all together, the edit view for the rental resource would look like this:
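Here's a sketch; the field names are illustrative:

```jsx
// src/components/RentalEdit.js
import * as React from 'react';
import { Edit, SimpleForm, TextInput, DateInput, SelectInput } from 'react-admin';

export const RentalEdit = (props) => (
  <Edit {...props}>
    <SimpleForm>
      {/* Identifiers shouldn't be editable, so they're disabled */}
      <TextInput disabled source="id" />
      <TextInput disabled source="customer_email" />
      <TextInput disabled source="film_title" />
      <DateInput source="return_date" />
      <SelectInput
        source="status"
        choices={[
          { id: 'borrowed', name: 'Borrowed' },
          { id: 'returned', name: 'Returned' },
        ]}
      />
    </SimpleForm>
  </Edit>
);
```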
Notice that some inputs have been disabled using the disabled prop.
Don’t forget to import and use the edit view in the rental resource component in your App.js file.
Save your files and let’s head to the browser. Click on an order to see the magic!
Okay, so we’ve completed the edit view. Now moving on to make the create view.
The create view is quite similar to the edit view. It’s so similar that I’m just going to paste the code right here and you wouldn’t be able to tell the difference. Just kidding 😜. Anyway, here’s the code for the create view
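A sketch of the create view; why the select options are fetched manually is explained right after:

```jsx
// src/components/RentalCreate.js
import * as React from 'react';
import { useEffect, useState } from 'react';
import { Create, SimpleForm, SelectInput, DateInput, useDataProvider } from 'react-admin';

export const RentalCreate = (props) => {
  const dataProvider = useDataProvider();
  const [customers, setCustomers] = useState([]);
  const [films, setFilms] = useState([]);

  useEffect(() => {
    // Manually query the customer and film resources to build the select options,
    // using the email and title fields (rather than ids) as the selected values
    dataProvider
      .getList('customers', {
        pagination: { page: 1, perPage: 100 },
        sort: { field: 'email', order: 'ASC' },
        filter: {},
      })
      .then(({ data }) => setCustomers(data.map((c) => ({ id: c.email, name: c.email }))));

    dataProvider
      .getList('films', {
        pagination: { page: 1, perPage: 100 },
        sort: { field: 'title', order: 'ASC' },
        filter: {},
      })
      .then(({ data }) => setFilms(data.map((f) => ({ id: f.title, name: f.title }))));
  }, [dataProvider]);

  return (
    <Create {...props}>
      <SimpleForm>
        <SelectInput source="customer_email" choices={customers} />
        <SelectInput source="film_title" choices={films} />
        <DateInput source="rental_date" />
      </SimpleForm>
    </Create>
  );
};
```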
The only difference here is that we have two select inputs that display a list of all customers and films by manually querying those resources.
Instead of writing custom logic to query the customer and film resources, we could have easily used the built-in <ReferenceInput> component. But currently, there's no way to set the selected value from the <SelectInput> component to something other than the document id. In the create form, we require the email field from the customer resource and the title field from the film resource. That is why we are querying manually; otherwise, the <ReferenceInput> component would have been awesome.
Do not forget to import and use the create view we just made. Also, register the film resource in App.js
If a resource is registered and no list view is passed to it, React-admin hides it from the navbar. But the resource is still useful for querying as we did for the film select input in the <RentalCreate> component.
This is the moment you’ve been waiting for! Save your files and head over to the browser. You’ll notice that we now have a create button on the rentals table, and clicking on a rental takes you to edit that rental. Sweet!
We’ve finally completed the dashboard! 🥳 🎉 🎊
We have a complete admin panel to manage rentals. We can see a list of customers, select a customer and view all their orders, and lastly, we can create new rental entries or edit existing ones. Awesome!
For some extra credit, let's add some authentication.
You must add some authentication to your apps, else anyone would be able to use it, even malicious individuals! Thankfully, adding authentication to our API and admin dashboard is not too difficult.
The first part of this section will focus on adding authentication to the Loopback API. You can skip this if you’ve been following along with your API. Next, we’ll look at implementing auth on the React-admin panel.
Loopback has various authentication strategies that we can employ to secure the API. We are going to go with the JWT authentication strategy, mostly because it’s super easy to set up and is fully supported by React-admin.
Enough talk, let's get started by installing the JWT auth extension library and Validatorjs on the Loopback API server.
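The package names below follow the usual Loopback JWT setup (treat them as assumptions and adjust to your project):

```bash
npm install --save @loopback/authentication @loopback/authentication-jwt validator
```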
Next, bind the authentication components to the application class in src/application.ts
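A sketch of the bindings, following the standard @loopback/authentication-jwt setup (the datasource class name depends on what you called yours):

```ts
// src/application.ts (excerpt)
import { AuthenticationComponent } from '@loopback/authentication';
import {
  JWTAuthenticationComponent,
  UserServiceBindings,
} from '@loopback/authentication-jwt';
import { MongoDataSource } from './datasources';

// Inside the application class constructor:
// Mount the authentication system
this.component(AuthenticationComponent);
// Mount the JWT strategy and its default user service
this.component(JWTAuthenticationComponent);
// Bind the datasource used to store users
this.dataSource(MongoDataSource, UserServiceBindings.DATASOURCE_NAME);
```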
Great job! We now have a foundation for auth.
Authentication usually works by validating the credentials of the user attempting to sign in and letting them through if valid credentials are supplied. Thus, we’ll need to create a user resource to represent a user. For our purposes, a user only has an id and an email field.
Alright, let’s create the user model using the Loopback CLI. Answer the CLI prompts as usual
We’ll also need to create a controller for the user resource that handles all authentication logic. You can use the CLI to generate an empty controller.
Note that this controller would need to be an empty controller and not a REST controller
The generated empty controller file can be found in src/controllers/user.controller.ts. Copy the contents of the file linked here into your controller file; it contains all the authentication logic.
Finally, we can secure the customer resource by adding the authentication strategy we just implemented to its controller. Here’s how to do it:
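A sketch of what that looks like on the customer controller:

```ts
// src/controllers/customer.controller.ts (excerpt)
import { authenticate } from '@loopback/authentication';

// Applying the decorator at the class level secures every route in the controller
@authenticate('jwt')
export class CustomerController {
  // ...existing CRUD handlers...
}
```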
Do the same for the film and rental resources by adding the authentication strategy to their respective controller files.
And that's it! If you visit the API explorer in your browser at http://localhost:4000/explorer/, you'll notice a nice green Authorize button at the top of the page. We also now have signup and login routes to create user accounts and log in. You'll need to use this playground/explorer to create a new user.
Now, let’s use this authentication on the React-admin dashboard.
Implementing authentication on the React-admin dashboard is fairly straightforward. We need an authProvider, an object containing methods for the authentication logic, and an httpClient function that adds the authorization header to every request the dashboard makes.
Create an Auth.js file in src/Auth.js that contains the authProvider object and the httpClient function. Here's what the content of the file should be:
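In case the link is unavailable, here's a minimal sketch of what such a file typically contains. It assumes the Loopback API runs on http://localhost:4000 and exposes the /users/login route created earlier; adjust both to your setup.

// src/Auth.js (a minimal sketch)
import { fetchUtils } from 'react-admin';

const apiUrl = 'http://localhost:4000';

export const authProvider = {
  // Exchange the user's credentials for a JWT and store it
  login: ({ username, password }) =>
    fetch(`${apiUrl}/users/login`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ email: username, password }),
    })
      .then((response) => {
        if (response.status < 200 || response.status >= 300) {
          throw new Error(response.statusText);
        }
        return response.json();
      })
      .then(({ token }) => localStorage.setItem('token', token)),
  logout: () => {
    localStorage.removeItem('token');
    return Promise.resolve();
  },
  checkAuth: () =>
    localStorage.getItem('token') ? Promise.resolve() : Promise.reject(),
  checkError: (error) =>
    error.status === 401 || error.status === 403
      ? Promise.reject()
      : Promise.resolve(),
  getPermissions: () => Promise.resolve(),
};

// Attach the stored JWT to every request the dashboard makes
export const httpClient = (url, options = {}) => {
  if (!options.headers) {
    options.headers = new Headers({ Accept: 'application/json' });
  }
  options.headers.set('Authorization', `Bearer ${localStorage.getItem('token')}`);
  return fetchUtils.fetchJson(url, options);
};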
Alright! Now let's make use of the authProvider and httpClient in our app. Import `authProvider` and `httpClient` from `Auth.js` into `App.js` and pass `httpClient` as a second parameter to `lb4Provider`. Then add an `authProvider` prop to the `Admin` component and pass in `authProvider` as its value.
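In other words, App.js ends up looking something like this sketch (the API URL and the resource list are whatever you already have):

// src/App.js (wiring only)
import React from 'react';
import { Admin, Resource } from 'react-admin';
import lb4Provider from 'react-admin-lb4';
import { authProvider, httpClient } from './Auth';

const App = () => (
  <Admin
    dataProvider={lb4Provider('http://localhost:4000', httpClient)}
    authProvider={authProvider}
  >
    <Resource name="customers" />
    {/* ...register your other resources with their list/edit/create views as before... */}
  </Admin>
);

export default App;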
Simple and easy!
Save the files and head back to the browser, and you'll be greeted with a login screen. Fill in the email and password of your registered user, and you'll be taken to the customers table like before.
And that’s it! We now have a super-secured app! 💪
We now have a fully functional admin dashboard with authentication. Lastly, I'd like to walk you through deploying it to your favourite cloud provider.
Since the API generated using Loopback is a standard Node.js server, you can deploy your app to any Node.js hosting provider, e.g., Heroku or Glitch. Note, though, that you will need to move all packages under devDependencies to the dependencies section in your package.json file.
And for the React-admin dashboard, you can deploy it on any static hosting service, e.g., Netlify or Vercel. Don't forget to replace the lb4Provider URL with that of your hosted backend.
If you're dealing with numbers, charts and graphs are the best way to make sense of them! They help you understand your data and make decisions based on it. With Appsmith, you can connect your data sources in just a few steps and generate beautiful graphs and charts.
As of now, Appsmith supports the following databases (in addition to using any REST APIs):
In this article, we will display data from our MySQL database to our app on Appsmith.
Log in to your Appsmith account or create a new account (if you don't have one). Once you're logged in, click on the "Create New" button:
After that, click on the Build with Drag & Drop widget. You’ll see this:
On the left sidebar, you can see an option for "Datasources". Click on the "+" button, and it will open a list of all the data sources supported by Appsmith:
For this blog, I will use the Mock Database (provided to all users of Appsmith to help with quick prototyping), which is based on MySQL; you can go for your preferred database.
NOTE: Since I am using the Mock Database, it doesn't ask for database credentials; when you select another data source, you'll need to enter your database credentials. Say you want to connect to a new data source; you'll see the page below:
Once you select the data source (for example, MySQL), you now need to enter your credentials:
Once you have added the database (in this case, selected the data source), you should see the list of all the tables present in your database.
Here’s how it will look:
This allows you to perform CRUD operations against those tables. Let's try to add our first widget. Click on the "+" button in front of Datasources; you should see the list of all connected data sources. Now click on "+ New Query" for the data source you want to query. Since we're using the mock database, we'll select that:
Once you click the “+ New Query” button, you now need to click on the “Select” option from the list to run the SELECT queries:
This will open a MySQL editor where you need to paste the below query:
Now, to display the data, we need to add it as a chart. Just select the chart widget, which is available in the right sidebar of the page.
Hover the cursor on "public.orders", click on the "Add" button, and select the SELECT command option. Now, paste the below query:
This will return the orders data for the last seven days. We now need to select the chart widget so that the data can be displayed.
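For instance, if the query above was saved as get_orders, the chart's series data can be bound with a moustache snippet like the one below; the chart widget expects an array of {x, y} points, so we map each row into that shape. The query name and column names here are assumptions, so match them to whatever your query actually returns.

{{
  get_orders.data.map((row) => ({
    x: row.date,
    y: row.order_count
  }))
}}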
Hover the cursor on “public.orders”, click on the “Add” button, and select the SELECT command option. Now, paste the below query:
This will return the revenue data for the last seven days. We now need to select the chart widget so that the data can be displayed.
Hover the cursor on “public.employees”, click on the “Add” button, and select the “SELECT” command option. Now, paste the below query:
This will return the employees’ data which includes employee id, employee name, and employee department. We now need to select the chart widget so that the data can be displayed.
Hover the cursor on “public.standup”, click on the “Add” button, and select the “SELECT” command option. Now, paste the below query:
This will return the employees’ standup data which includes the date, employee name, and employee notes. We now need to select the chart widget so that the data can be displayed.
Once we're done with adding and aligning all the widgets, we also need to deploy the app: just click on the "Deploy" button at the top of the page, and it'll get deployed instantly!
You can also check out the live example here.
Displaying data from a database is very easy with Appsmith. Our plug-n-play UI allows you to add any database and display the data in graphs, charts, tables, dropdowns, or even as plain text. You can also add a lot more functionality to this dashboard, like creating new orders, creating or updating employee data, or performing other CRUD operations on your database tables using our widgets.
Testing an API is one of the most important phases of the API development lifecycle. It ensures that the API you're deploying on the server is bug-free and highly optimized. But the testing phase can be very complex, as it involves different types of testing such as load testing, regression testing, and security testing. Developers and testing teams face a lot of challenges during API testing. Let's discuss them first:
Why is API Testing a Difficult Task?
We all know that APIs involve a lot of modules with a lot of functionality. Let's take a simple example of an e-commerce application. You're going to have many endpoints, such as /login, /logout, /cart, /wishlist, /profile, and so on. You need to ensure that each endpoint delivers what it is supposed to. For example, /cart should only show products associated with a particular profile and shouldn't mix in products from other profiles.
An application like the one mentioned above can have about 300-400 endpoints, or sometimes even more! On top of that, you need to make sure that the validations work correctly, response times are low or at least optimized, there are no bugs in the API, and performance holds up even when thousands of requests are made simultaneously. You also need to ensure that the API returns appropriate status codes (2xx, 4xx, 5xx, and so on). All this makes API testing not only tricky but also time-consuming.
To reduce the complexity of the whole testing and deployment process, there are a lot of open-source tools available on the internet (if you prefer non-open-source tools, you can check out Postman or Firecamp). These tools not only save a lot of time but also give you insights such as the API's response time, among others.
Once you're done with testing, you can deploy the API on a server. Deployment is the process of going live with the API by moving it to the live server. Every time you make a change to the API, you need to redeploy it on the server (after testing, obviously! 😬)
Here's our list of open source tools that you can use to test and deploy your API:
SoapUI is an API tool that allows you to test and deploy your APIs, and it is one of the most mature and trusted API testing tools around. One of its unique features is that it supports SOAP APIs too. The tool is mainly used for QA and API testing, and it also allows you to connect external data sheets to retrieve data for executions.
SoapUI also allows you to send multiple API requests by triggering a single test case, and it supports a wide variety of testing such as load testing, functional testing, and security testing.
Apache JMeter is an open-source testing tool that tests not only APIs but scripts too. You can create your own test cases, and it'll perform different types of testing, like module testing and regression testing.
The UI is quite simple and easy to use. You can test APIs in two ways: either make direct API requests or write code that makes the requests to an API endpoint. The tool is written entirely in Java and supports multiple languages such as Python, C, and Java.
It also comes with a marketplace where you can just download the plugins to expand the platform’s functionalities. It supports multiple protocols such as FTP, HTTP, LDAP, SOAP, etc. JMeter also supports graphs and charts, so the results can be visualized easily. To perform UI testing, you can run Selenium test cases as well.
Hoppscotch, previously known as Postwoman, is another popular open-source API development and testing platform. It has a dark UI and a minimalistic design scheme. It is one of the fastest API testing tools allowing you to send requests and copy responses in real-time.
It comes with a variety of themes, and you can even install it as a PWA (Progressive Web App) on your mobile device. The tool also lets you open a full-duplex communication channel over a single TCP connection; in other words, you can make WebSocket connections. Another big feature is that you can also test GraphQL queries.
Karate, developed by Intuit, is used for multiple purposes like API testing, deployment, creating mock test servers, and web browser automation. It is written in Java but doesn't require end-users to write anything in Java; it's so easy to use that even non-programmers can write test cases. It supports YAML as well as CSV, so you can easily use them to write data-driven tests. You can also perform cross-browser web UI testing.
Insomnia is another open-source tool that lets you track the development and deployment of API endpoints very easily. It uses a Swagger-inspired editor, so if you're familiar with Swagger, you'll find this tool easy to use.
It allows you to import API specs in the OpenAPI format as well as from Postman. It also comes with a CLI tool called Inso, which lets you go in-depth with API testing. You can also connect version control services like GitHub, Bitbucket, and SVN.
What’s Next?
Now that you’re equipped with your APIs, you can use Appsmith to create full-fledged applications by connecting your data to our extensive repository of pre-built UI widgets like forms, buttons, lists, maps and so much more. And since Appsmith is a GUI based platform, you can drag and drop these widgets to create your applications. You can also invite your colleagues to collaborate with them and then deploy it to be shared with internal or external users.
Psst! You can connect your data on Appsmith either through APIs or through our native integrations with popular databases like Postgres, MongoDB, and Snowflake, among others, as well as apps like Google Sheets!
Also, by running cURL commands directly on the platform, you can test and deploy your apps easily and quickly.
Are you interested in building something with Appsmith? Take it for a spin. Join our vibrant community on Discord. To get regular updates on what we’re up to, follow us on Twitter!
The database is one of the most critical parts of an application. Why? All the information your app receives is stored in the database, and the app pulls that information back up in the way you want it. So the first step in building an app is to connect your data. It's no wonder that databases dominate the world of apps.
Some proprietary databases can be expensive and tend to offer limited technologies for data storage; however, you can always choose open-source databases for your next project. You can self-host open-source databases and configure them as you like, since the source code is available. Not just this: open-source databases are very flexible as well as cost-effective. Many applications use more than one technology for data storage. For example, to deal with real-time data (say, data about real-time visitors), MySQL is not a good choice: it is not designed for high concurrency and takes a lot of time to run multiple queries at the same time. App developers tend to go with a database like MongoDB instead, as it supports a high level of concurrency. However, the data science team for the same application would probably still use MySQL for running complex queries. This is how developers end up using more than one database technology to connect to and perform CRUD operations on.
There are many database managers available out there, but only a few of them support multiple database technologies. Any good database manager should be able to support multiple databases and the following features:
Of course, needless to say, the more the merrier! The ones I’ve mentioned above are the minimum standard across leading open-source database managers.
We’ve curated a list of some of the popular database managers for your next project. Dive right in!
OmniDB is an open-source database manager that provides a collaborative environment for developing and managing your database. It comes in two variants: a hosted version and a stand-alone app.
It’s powered by WebSockets and allows you to run multiple queries on multiple databases in the background efficiently. The application is built keeping simplicity in mind, making it one of the most user-friendly database managers.
It also supports multiple tabs, which allow you to write clean code and make multiple database connections in one session. It also has a smart SQL editor, which comes with linting, auto-completion, beautification, and more.
HeidiSQL is another open-source database manager; it is extremely user-friendly and lets you connect to multiple databases. It is one of the most powerful database managers out there and enables you to create tables, view logs, and manage users on MySQL and other database technologies.
The tool was initially developed to make connections with MySQL only. However, it was extended to MS SQL Server, and it now includes PostgreSQL support as well. HeidiSQL's UI is pretty clean and allows you to create multiple connections. Once you install it, a setup screen follows, collecting essential information like IP, port, username, and password.
You can also export your data as CSV, Excel, HTML, SQL, LaTeX, Wiki Markup, and PHP arrays, and you can edit multiple tables together using the bulk edit option. Not just this: the process monitor allows you to kill costly processes.
RockMongo is an open-source MongoDB GUI tool that is written in PHP and runs entirely as a web application. It looks very similar to phpMyAdmin and comes with a classic 90s UI (Windows 98-style buttons and a non-responsive layout). It supports all the common operations, making it easy to work with collections, stats, etc.
Like the ones above, you can connect and store the credentials of multiple MongoDB servers, but it doesn’t support tabs. This means you can only work on a single MongoDB connection at a time.
Another drawback of using this tool is the dependency on the PHP server. To run and execute queries on this tool, you need to install and run a PHP server.
Robo 3T, formerly known as RoboMongo, is another open-source MongoDB GUI client. The application is available for platforms like Ubuntu, Mac, and Windows. It comes embedded with the default MongoDB shell and allows you to run complex queries in a much faster way.
Robo 3T is one of the most popular projects on GitHub, which means there's an experienced community to help you out. And since it uses the default MongoDB shell, its resource consumption is likely to be relatively low.
Navicat is another powerful database management and design application that supports multiple drivers and databases. It comes as a standalone application for Mac, Windows, and Linux and allows you to manage MySQL, MariaDB, SQL Server, SQLite, Oracle, and PostgreSQL databases very easily.
This application comes with a lot of functionality out of the box, like export to Excel, stored procedures, scheduling, and data transfer. Data transfer is one of the most interesting features: it allows you to transfer tables from one database to another even if they're not on the same server.
What’s next?
Once you're done with your database development and design, you'll need a platform to work with the data, right? You can use Appsmith to connect your databases easily. Let's say you're building an API: you can use Appsmith to connect the database and deploy your API. Or let's say you want to fetch data from a database and plot a graph using the data: you can very easily use our drag-and-drop widgets to create the graph and deploy it.
Guess what, you can also connect your data on Appsmith either through APIs or through our native integrations with popular databases like Postgres, MongoDB, and Snowflake, among others, as well as apps like Google Sheets!
Are you interested in building something with Appsmith? Take it for a spin. Join our vibrant community on Discord. To get regular updates on what we’re up to, follow us on Twitter!
GitLab is eating the world, or so we thought till we moved to GitHub as an open-source company. That should tell you enough about our love for GitLab, but there is one thorny problem with GitLab CI that didn't have a solution in any pricing tier: the ability to manually trigger CI jobs with custom parameter values. This article will explore the benefits and drawbacks of manual jobs, potential workarounds, and finally, how we can get around this problem using the GitLab API and forms.
Although CI is primarily aimed to be continuous (I didn't see that coming, did you?), there is still a precious place for manually triggered jobs in a CI pipeline. This becomes especially apparent when we consider parameters in the build process. For example, say you are working on a separate branch that does a PoC of a new feature. You want to ensure that the CI can handle the codebase in this branch, but you don't want to push the resulting image to your official organization. If your build script took parameters (with sane defaults, of course) that could be configured to push the image to your personal organization instead, you'd be able to test it to your heart's content. What's more, you could even run the Docker image and check that all is in order. Basically, you don't break the app.
Another simple example would be when you have a specific set of servers you want to push your changes to from a specific branch in your repo. This is a case of CD, with finer control. You can use the same deploy pipeline but set the branch and server(s) variables and run this job to get the desired outcome. This becomes incredibly useful as the organization scales and has multiple branches, teams, and servers, especially in this age of mono-repos.
If you're a pipeline wizard, you can probably spin off a new script for these jobs each time and run those scripts by hand, but a more practical solution is to use a pipeline template with a few variables (with defaults set for a standard run) that can be populated for manual runs. This is where GitLab comes up a bit short.
I have no idea what I am doing here!
If you're willing to compromise on everything else GitLab has to offer, Jenkins is your go-to for CI/CD, and boy does it ship with an out-of-the-box manual job runner. But of course, who wants to compromise on all the other awesome stuff from GitLab?
The biggest inconvenience stems from the fact that Jenkins and GitLab are two separate systems. It's up to the developer to ensure that the two talk to each other well and stay on good terms. For example, if the source code on GitLab moves from being a Java project to a Golang project, the Jenkins build jobs must be reconfigured to stop using Maven and start using Go. Having GitLab be the integrated solution for both your source code and your CI pipelines is just too convenient. The overhead of running both Jenkins and GitLab will just make your team grumble much more than usual.
Let's say you do end up integrating with Jenkins: is that really worth sacrificing all of GitLab's CI/CD conveniences for?
Do you want to integrate this with your repo?
Assuming you haven't ripped apart your GitLab instance and installed Jenkins, and you're still asking yourself, "So how does one manually trigger jobs on GitLab?", the main (and surprisingly the simplest) suggestion is to build a form for each job and use the GitLab API to trigger it. That suggestion comes from the discussion on this issue from four years ago (remember, Google is your friend). Why don't we give that a shot? Sounds simple enough, right?
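The gist of that suggestion is a plain HTML form whose submit handler calls GitLab's pipeline-trigger endpoint. A sketch of such a handler is below; PROJECT_ID, TRIGGER_TOKEN, and the DEPLOY_SERVER variable are placeholders for your own values.

// Trigger a pipeline on a given branch with a custom variable
async function triggerPipeline(branch, server) {
  const form = new URLSearchParams();
  form.append('token', TRIGGER_TOKEN); // trigger token from the project's CI/CD settings
  form.append('ref', branch); // the branch to run the pipeline on
  form.append('variables[DEPLOY_SERVER]', server); // custom pipeline variable

  const response = await fetch(
    `https://gitlab.com/api/v4/projects/${PROJECT_ID}/trigger/pipeline`,
    { method: 'POST', body: form }
  );
  return response.json();
}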
We're not even getting started here 😰
Voila? 🤔
But there are a few things to remember before we deploy that to production. You need to update the branch list whenever it changes, which is almost every day. You also have to remember to add or remove parameters along with your build pipeline. Every time you run a manual job, you essentially have to rebuild this file, or scope out all potential manual jobs and keep files ready for each. This is really inconvenient.
Beyond just being inconvenient, this method doesn't scale. Even if you maintain this HTML form as part of your source code, it still warrants a significant maintenance effort. You don't want to spend time building/debugging/fixing/maintaining your pipeline-runner, you want to build the software you are actually working on.
Fear not, Appsmith is here. While not an official workaround, Appsmith makes it incredibly easy to create your forms and set up the API calls needed to run manual jobs. Building the form is easier than in Jenkins, API calls can be managed gracefully in the interface, AND you can build a job runner that looks like this in 10 minutes.
We've built a small tutorial to walk you through how we create a manual pipeline to deploy your code on multiple production servers. Let’s get started!
Appsmith is an open-source, cloud or self-hosted platform for building admin panels, CRUD apps, and workflows. Appsmith helps you speed up application building through fast UI creation, letting you write code only when necessary.
Here’s a video to get you started!
We’ll be building a dashboard on Appsmith to simplify and manage a simple CI/CD workflow that allows developers to deploy software by selecting a specific branch and server.
Additionally, we’ll create options to
This dashboard is going to be created entirely using Appsmith and the Gitlab API.
We'll be using a GitLab repository named ci_cd to follow along and set up workflows. It's a basic "Hello World!" Node.js application that is deployed to multiple servers on Heroku. Our .gitlab-ci.yml file is configured to run two jobs: the first performs a test with the npm test command, and the second deploys our app onto Heroku using a Ruby gem called dpl.
The entire workflow is configured dynamically so that it can deploy to any Heroku server given a Heroku <app-name> and an <api-key>. To keep these keys hidden, we'll set them as environment variables for the workflow. Lastly, on Heroku, we have two deployment servers: appsmith-ci-cd-server1 and appsmith-ci-cd-server2.
Below is a snippet showing the content of the .gitlab-ci.yml file.
Now that we have a clear picture of what’s cooking, let’s head to Appsmith and start building our Dashboard. If you haven’t signed up, you can get started for free, here.
Next, let's head over to our Appsmith dashboard and create a new application by clicking on the Create New button. We'll be redirected to a new application named "Untitled Application 1"; let's give our app a creative name and call it Deploy Dashboard. Our first step is to build a feature that lists all existing and future workflow history. To do this, we'll use GitLab's pipelines API, fetch all the details, and display them in the Appsmith table widget.
Let's go ahead and configure our GitLab workflow API.
In this section, we'll configure the GitLab API as a data source for the Appsmith Deploy Dashboard application. Let's create a new API by navigating to the API section under Page1 and clicking on the Create New option.
Paste the following in the request form and click on Save as Data Source
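The datasource here is just GitLab's REST API scoped to the project, authenticated with a personal access token in a header; something like the following, with your own project id and token swapped in:

https://gitlab.com/api/v4/projects/<project-id>
PRIVATE-TOKEN: <your-access-token>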
Next, let’s follow the below steps to display all the existing workflows onto the Appsmith dashboard.
Our data source is up and running; now let's head back to the API section on Appsmith and create a new API that fetches all the existing workflows on the repository. Create a new API and name it get_pipelines. When you click on the URL field, you'll see a suggestion to use our previously configured GitLab data source. Use the suggestion and add /pipelines to its end. The URL should now look like the following:
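With the example datasource above, the full URL works out to:

https://gitlab.com/api/v4/projects/<project-id>/pipelines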
Hit the Run button and you will be greeted with an array of the workflows linked to that repository! Sweet, isn't it?
Now, to make things cooler, let's build a UI to display all these CI/CD workflows on the Appsmith page. Click on the widgets tab in the navigation bar and drag and drop a table widget onto the canvas. You should see a new table with some pre-populated data, along with a floating pane that contains all the table properties. In that pane, edit the Table Data property to the following:
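That is, we bind the table to the response of the API we just created:

{{ get_pipelines.data }}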
Now we can see data from the get_pipelines API rendered in the table. You can rearrange the columns and disable the ones you don't want showing up, e.g., sha and updated_at.
Now let's add a new feature to trigger a workflow from the same dashboard. To do this, we'll create a new button by dragging and dropping a button widget; rename it to Trigger New Pipeline. Also, drag a modal widget onto the canvas, and configure the button to open the modal whenever it's clicked. Set the modal's type to form modal, and drag in two dropdowns with corresponding labels. The first dropdown should be configured to select a branch, with the following options:
Similarly, we configure the second dropdown to show the server options that are configured on Heroku namely, appsmith-ci-cd-server1 and appsmith-ci-cd-server2:
Perfect, we should now see a great looking modal on our Appsmith dashboard.
Let's head back to the API section and create a new API that triggers a workflow whenever the Confirm button on our modal is clicked. Name this API create_pipeline and set its value to the following:
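GitLab's documented endpoint for creating a pipeline is POST /projects/:id/pipeline, so with our datasource the API works out to something like this (with the method set to POST):

https://gitlab.com/api/v4/projects/<project-id>/pipeline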
Additionally, we have the option to provide variables and ref (meaning the source branch) in the body of this endpoint. We should configure the body as given in the below snippet.
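A body matching that description could look like the snippet below. The dropdown widget names (select_branch and select_server) are assumptions, so use whatever you named yours; the API key should come from a safe place rather than being hard-coded.

{
  "ref": "{{ select_branch.selectedOptionValue }}",
  "variables": [
    { "key": "heroku_app_name", "value": "{{ select_server.selectedOptionValue }}" },
    { "key": "heroku_api_key", "value": "<your-heroku-api-key>" }
  ]
}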
As the snippet above shows, the ref key is obtained from the branch dropdown we configured earlier, and in the variables section, the value of the heroku_app_name key is obtained from our server dropdown.
"You can find the value of heroku_api_key from your Heroku account under the Settings-> API Keys section."
Lastly, let's head back to the modal and configure the onClick action of the Confirm button to trigger the create_pipeline endpoint. Add the following JS snippet to the onClick property under the button settings:
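A snippet along these lines does the trick; it runs the API and, on success, refreshes the pipelines table (the API names should match yours):

{{ create_pipeline.run(() => get_pipelines.run(), () => showAlert('Failed to trigger the pipeline')) }}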
Kudos! With this, we should be able to trigger a workflow from the Appsmith dashboard itself. Let’s now fine-tune this to have more features in the next section.
Alright, in this last section, we'll add fine-grained controls such as deleting, cancelling, retrying, and viewing workflows. These are quite similar to one another, so let's look at the delete option, and you can try adding the rest as an exercise :)
Deleting the CI/CD Workflow from Appsmith Dashboard
To implement this feature, let's head back to our table and add a new "Custom Column"; you can find the option under the table's settings pane. Name this column delete. To make our dashboard more intuitive, set the column type to button and the label to delete. Now create a delete API under the APIs section and name it delete_pipeline. Use the following endpoint to add the functionality:
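GitLab's delete-pipeline endpoint just needs the pipeline id, which we can read off the selected table row (Table1 is the default table widget name; adjust if you renamed it). With the method set to DELETE:

https://gitlab.com/api/v4/projects/<project-id>/pipelines/{{ Table1.selectedRow.id }}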
This API grabs the id of the selected row from the table, which is set automatically when the delete button is clicked. Heading back to the delete button, configure its onClick action to run the delete_pipeline API and call the get_pipelines API on success to refresh the table. Here's the configuration JS snippet:
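Something along these lines (again a sketch; the API and widget names should match yours):

{{ delete_pipeline.run(() => get_pipelines.run(), () => showAlert('Failed to delete the pipeline')) }}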
Perfect, now we have the ability to delete a specific workflow from the Appsmith dashboard itself.
"Disclaimer: Clicking on this button will delete that pipeline for good. As a safety measure, you can add a confirmation modal to prevent accidental delete."
Here’s a quick lookup to configure the other actions to play with our workflows:
Retry Workflow:
Cancel Workflow:
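Both map to documented GitLab endpoints that, like delete, only need the pipeline id; create a POST API for each and wire it to its own button column the same way. For retry:

https://gitlab.com/api/v4/projects/<project-id>/pipelines/{{ Table1.selectedRow.id }}/retry

and for cancel:

https://gitlab.com/api/v4/projects/<project-id>/pipelines/{{ Table1.selectedRow.id }}/cancel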
We are glad you made it this far. As a quick summary, here’s a gif to show everything we’ve built so far!
Honestly, that was a lot to take in, but we've gone through a complete flow of how you can build a custom dashboard to manage your CI/CD process. Also, here's a quick demo of the app that we've built! And we're quite sure you're bubbling with ideas about the limitless use cases you can build on this. So go into the world and build amazing stuff!
Image Credits: Photo by Ryan Quintal on Unsplash
JUnit is one of the most popular unit testing frameworks used with Java to create repeatable tests. With JUnit, each test is written as a separate method inside a Java class. IntelliJ IDEA provides an option to run these test cases from within the IDE.
In case you have a module that communicates with a MySQL database, you can unit test the module by providing it access to a MySQL server running inside a testcontainer. You can also configure this MySQL database instance with your desired username, password and database name (in MySQL server) using the API provided by Testcontainers framework.
In case you use Maven to manage dependencies in your project, as Appsmith does, you can add the following snippet to your POM file to include all the required packages:
To create a new MySQL testcontainer instance with JUnit 4 you may follow these steps as used in Appsmith's unit test file to test its MySQL plugin:
Please note that the Testcontainers integration differs between JUnit 4 and JUnit 5, so use the one that matches the JUnit version you're on. For more details, please see the Testcontainers page.
Databases spawned using Testcontainers can seem inaccessible from outside the IDE when the tests are run from within it. To connect to such databases, you can use the database tool that comes with the IDEA Ultimate edition.
Steps to connect to the MySQL database:
1. Add a breakpoint in the code at a point where the testcontainer has already been brought up.
2. Run the test program in debug mode and wait till it stops on the breakpoint.
3. Click on the database tool.
4. Select your database type.
5. Fetch your credentials. With JUnit 4, you can read them from the testcontainer using its accessor methods, e.g., getJdbcUrl(), getUsername(), and getPassword().
6. Test your connection and save credentials.
7. Run query.
It is noteworthy that Testcontainers provides containerized instances of many other popular databases as well, like Postgres, MongoDB, and Neo4j. Similarly, IntelliJ IDEA's database tool provides connectivity support for most popular databases. The steps described above, to integrate the Testcontainers package or to investigate the containerized database, can be used with databases other than MySQL as well. In summary, the steps to write a unit test using JUnit and any testcontainer can be generalized as follows:
The steps to investigate the containerized database instantiated above can be generalized as follows:
In case you need more code examples to see the above steps in use, do check out the test files in Appsmith's GitHub repository. I hope this article was useful to you; do share your feedback in the comments.