Building an Inventory Management Tool Using HarperDB
Dancia Jos · 7 June 2022

This guide explains how to quickly build an Inventory Management tool to manage movie rentals by connecting a HarperDB backend to an Appsmith frontend. It also implements authentication using HarperDB’s role-based access control feature.

HarperDB is a distributed database that supports key-value, document store, and traditional SQL. As a flexible platform for distributed applications and data, HarperDB is read/write optimized and globally replicates data in under 100ms.

You will need a HarperDB account; you can create one by signing up here.

Appsmith is an open-source framework for building internal tools. It lets you drag and drop UI components to build pages, connect to any API, database, or GraphQL source, and write logic with JavaScript objects.

If you don't have an Appsmith account, create one here.

Let’s get started!

Step 1: Set up your Backend on HarperDB

We’ll be using the free tier of the HarperDB cloud instance. Click on Create New HarperDB Cloud Instance → Create AWS or Verizon Wavelength Instance.


Select Choose HarperDB Cloud on AWS and then click on Instance Info


Fill in the details about your instance and click on the Instance Details button.


Keep the default settings in the Instance Specs modal and click on Confirm Instance Details


Click on the Add Instance button


This will begin creating your database cloud instance on HarperDB. Your instance will be up and running in no time.


Step 2: Create the Database (Schema)

  • Create a schema called Movies
  • Next, create tables for Customer, Film, and Rental with id as the hash attr. The hash attr in HarperDB is equivalent to a primary key.

Use the Upload CSV feature of HarperDB to populate data into the three tables. You can find sample data here.


Step 3: Create Roles and Users on HarperDB

Click on the roles tab and create two roles named admin_user and rental_user under Standard Roles. admin_user will have all privileges on the three tables, so set all its permissions to true.

rental_user will have all permissions on the Rental table and only the read permission on the Customer and Film tables.

Under the users tab, create a user named assistant with the role of rental_user.


Similarly, create a user named manager with the role of admin_user.

And just like that, the backend setup is complete.

Step 4: Build UI on Appsmith

  • On Appsmith, in the Entity Explorer pane, go to PAGES → Page1. Rename the page to Login. Drag and drop a Form widget onto the canvas. Add input widgets for Username and Password. For the Password input widget, choose Password as the Data Type so that the characters entered are masked.
  • Create a second page called Rental Inventory which will be the page that the user will be redirected to after login.
  • Drag and drop a Tab widget onto the canvas and add three tabs called Rental, Film, and Customer
  • On the Rental tab, add a Table widget, and a Form widget to its right that will display the details of the selected rental record so they can be updated. On the Update Rental form, Customer and Movie are Text widgets, Status is a Select widget, and Returned is a Datepicker widget.
  • Add a modal widget onto the canvas to create a form for adding new rentals. Customer (rentalCustEmail) and Film (filmTitle) are select widgets
  • Set this modal to open on the onClick event of the New button
  • Create the same layout for the Film tab. Title is a Text widget; Rental Duration, Rental Rate, and Replacement Cost are all Input widgets with the Data Type set to Number.
  • Drop a Modal widget on the canvas and create a form to add new movies to the collection. Title and Description are Input widgets with the Data Type property set to Text, and the remaining are Input widgets with the Data Type property set to Number. Set this modal to open on the onClick event of the New button on the Film tab.
  • Lastly, add widgets to the Customer tab. Search is an input widget (use the search icon under Icon Options and add “Type to search” under Placeholder in the property pane) and customerList is a List widget with two input widgets for Name and Email.
  • Drop a Modal widget on the canvas and create a form to add new customers to the database. All are input widgets. Set this modal to open on the onClick event of the New button on the Customer tab

Step 5: Configure the Login page

  • The next step is to create a query to retrieve the JWT tokens when the correct credentials are entered in the login form.
  • On the Login page, click on the + icon next to Queries/JS. Select New Blank API
  • On the API pane, rename Api1 to get_token and select the POST method. For the URL endpoint, enter the URL of the HarperDB instance. You can copy-paste this from your instance card on the HarperDB dashboard.

Under Headers, ensure that the content-type is application/json

  • Under the Body tab of the API pane, enter the following in Raw format. Notice that you are passing the username and password entered in the two input widgets on the form.

{
    "operation": "create_authentication_tokens",
    "username": "{{Input1.text}}",
    "password": "{{Input2.text}}"
}


You can enter the username and password on the Login form and then test run the query to view the response. This query returns two tokens: the operation_token, which is used to authenticate all HarperDB operations via the Bearer Token Authorization header, and the refresh_token, which is used to generate a new operation_token upon expiry.
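When the operation_token expires, HarperDB can issue a fresh one from the refresh_token. As a minimal sketch (this call is not needed for the rest of this tutorial), send a POST with the refresh_token as the Bearer token in the Authorization header and the following body:

{
    "operation": "refresh_operation_token"
}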

  • Next, we are going to create a JavaScript function that will redirect the user to the main page (Rental Inventory) of the app based on credentials and their roles.
  • Click on the + icon next to Queries/JS. Select New JS Object. Rename JSObject1 to utils and paste the code snippet below in the JavaScript Editor.


export default {
    login: async () => {
        // Run the get_token API, store the username and operation_token
        // for later use, then redirect to the main page
        return get_token.run()
        .then(res => storeValue('user', {username: Input1.text, token: res.operation_token}))
        .then(() => navigateTo('Rental Inventory'))
        .catch(e => showAlert(e.message, 'error'))
    }
}


Note: storeValue is an Appsmith function that is used to store data in key-value pairs that can be accessed later anywhere in the application.
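For example, the value stored above can be read anywhere in the app through the appsmith.store object; the first binding below returns the logged-in username, and the second returns the operation_token:

{{appsmith.store.user.username}}
{{appsmith.store.user.token}}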

  • Bind the JS Object to the onClick event of the Submit button on the login form.
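With JS enabled on the onClick event, the binding is simply:

{{utils.login()}}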

Step 6: Write queries and bind data to widgets

  • Create a query to fetch data from the Rental table in HarperDB and display it in the Table widget on Appsmith (rentalTable)
  • On the Rental Inventory page, click on the + icon next to Queries/JS. Select New Blank API
  • On the API pane, rename Api1 to get_rental and select the POST method. For the URL endpoint, enter the URL of the HarperDB instance as shown in Step 5.
  • Under Headers, add the following key-value pairs. We have to pass the Bearer token generated on successful login under Authorization.
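Assuming the login flow above stored the token under the user key, the two headers would be:

content-type: application/json
Authorization: Bearer {{appsmith.store.user.token}}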
  • Under the Body tab of the API pane, enter the following in Raw format. This is an SQL query to fetch the data from the rental table and we are using server-side pagination to display data in the table widget (rentalTable), hence the limit and offset are set in the query.

{
    "operation": "sql",
    "sql": "SELECT * FROM movies.Rental LIMIT {{rentalTable.pageSize}} OFFSET {{(rentalTable.pageNo - 1) * rentalTable.pageSize}}"
}


  • Bind the data from this query under Table Data property of the rentalTable table using the following snippet


{{get_rental.data}}


  • You will notice the table widget populates with results from the query.
  • Next, we will bind data to the widgets on the Update form to the right of the table. When a user clicks on a table row, the corresponding details should be displayed on the form.
  • For the Rental Update form, set the widgets as sketched below. Notice that we are manually adding the values in the Status select widget so that the status of the rental can be updated on the form.
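As a sketch, the form's default values can be bound to the selected table row; the field names below match the queries later in this step, while the exact bindings depend on how you named your widgets:

Customer:  {{rentalTable.selectedRow.customer_email}}
Movie:     {{rentalTable.selectedRow.film_title}}
Status:    {{rentalTable.selectedRow.status}}
Returned:  {{rentalTable.selectedRow.return_date}}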
  • Create a query to update the Rental status and Returned date on this form.
  • On the Rental Inventory page, click on the + icon next to Queries/JS. Select New Blank API
  • On the API pane, rename Api1 to update_rental and select the POST method. Repeat the same steps as above for the URL endpoint and the Headers
  • In the Body tab of the API pane, enter the following under Raw tab

{
    "operation": "update",
    "schema": "movies",
    "table": "Rental",
    "records": [
        {
            "id": {{rentalTable.selectedRow.id}},
            "status": {{status.selectedOptionValue}},
            "return_date": {{returnedDate.selectedDate}}
        }
    ]
}


  • Hook this query to the onClick event of the Submit button on the Update Rental form. On success of this query, also run the get_rental query to repopulate the table widget with updated data.
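With JS enabled on the onClick event, this chaining can be written as follows (a sketch using the query names above):

{{update_rental.run(() => get_rental.run(), (e) => showAlert(e.message, 'error'))}}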
  • Create an API query named search_customer, with the same headers as before, and the following code snippet under the Body tab of the API pane in Raw format


  {
      "operation": "search_by_value",
      "schema": "movies",
      "table": "Customer",
      "search_attribute": "email",
      "limit": 10,
      "search_value": {{`*${rentalCustEmail.filterText}*`}},
      "get_attributes": [
          "id", "email"
      ]
  }


  • Create another API query named search_film with the following code snippet under the Body tab of the API pane in Raw format.

  {
      "operation": "search_by_value",
      "schema": "movies",
      "table": "Film",
      "search_attribute": "title",
      "limit": 10,
      "search_value": {{`*${filmTitle.filterText}*`}},
      "get_attributes": [
          "id", "title"
      ]
  }


  • Create a third query named new_rental to add new rental information, with the following code snippet under the Body tab of the API pane in Raw format.

  {
      "operation": "upsert",
      "schema": "movies",
      "table": "Rental",
      "records": [
          {
              "film_title": {{filmTitle.selectedOptionValue}},
              "rental_date": {{moment().utc()}},
              "staff_id": {{appsmith.store.user.username == 'assistant' ? 2 : 1}},
              "customer_email": {{rentalCustEmail.selectedOptionValue}},
              "status": "borrowed"
          }
      ]
  }


  • Open the modal by clicking the New button on the Rental tab. On the Customer select widget, write the following code snippet to populate the customer email IDs from the search_customer query.

{{search_customer.data.map(c => ({label: c.email, value: c.email}))}}
  • Similarly, populate the Film select widget from the search_film query with the code snippet below.

{{search_film.data.map(m => ({label: m.title, value: m.title}))}}


  • Set the new_rental query to run on the onClick event of the Confirm button on the modal.
  • Repeat the above steps for the Film Tab.
  • The Update form on the Film tab has data bindings similar to the Rental form, as sketched below.
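A sketch of the default-value bindings, assuming the field names used by the update_film query below:

Title:            {{filmTable.selectedRow.title}}
Rental Duration:  {{filmTable.selectedRow.rental_duration}}
Rental Rate:      {{filmTable.selectedRow.rental_rate}}
Replacement Cost: {{filmTable.selectedRow.replacement_cost}}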
  • Create an API query named update_film with the following code snippet under the Body tab of the API pane in Raw format. Bind this query to run on the onClick event of the Submit button on the Update Film form.


{
    "operation": "update",
    "schema": "movies",
    "table": "Film",
    "records": [
        {
            "id": {{filmTable.selectedRow.id}},
            "rental_duration": {{ufDuration.text}},
            "rental_rate": {{ufRate.text}},
            "replacement_cost": {{ufCost.text}}
        }
    ]
}



  • Let’s configure the query that adds new film data. Create an API query named create_film with the same headers as before, and add its body under the Body tab of the API pane in Raw format, as sketched below.
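The body follows the same upsert pattern as new_rental; the widget names below are assumptions and should match the inputs on your Film modal:

  {
      "operation": "upsert",
      "schema": "movies",
      "table": "Film",
      "records": [
          {
              "title": {{newFilmTitle.text}},
              "description": {{newFilmDescription.text}},
              "rental_duration": {{newFilmDuration.text}},
              "rental_rate": {{newFilmRate.text}},
              "replacement_cost": {{newFilmCost.text}}
          }
      ]
  }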


  • Open the modal by clicking the New button on the Film tab. Set the create_film query to run on the onClick event of the Confirm button on the modal. On success of the query, close the modal and rerun the get_film query to refresh the Film table widget.
  • Finally, on the Customer tab, write queries for the search input widget (customerSearchInput) and for displaying data in the list (customerList) below.
  • Create a query called get_customers with the following under the Body tab of the API pane.

{
    "operation": "search_by_value",
    "schema": "movies",
    "table": "Customer",
    "search_attribute": "email",
    "limit": 10,
    "search_value": "{{`*${customerSearchInput.text}*`}}",
    "get_attributes": [
        "id", "email", "first_name", "last_name"
    ]
}


Bind this query to the onTextChanged action of the customerSearchInput box:


{{get_customers.run()}}


  • Bind the results of this query {{get_customers.data}} to the Items property of the customerList box
  • Let’s configure the query that adds new customer data. Create an API query named new_customer with the same headers as before, and enter the following code snippet under the Body tab of the API pane in Raw format.

  {
      "operation": "upsert",
      "schema": "movies",
      "table": "Customer",
      "records": [
          {
              "first_name": {{ncFNameInput.text}},
              "last_name": {{ncLNameInput.text}},
              "email": {{ncEmailInput.text}}
          }
      ]
  }


  • Open the modal by clicking the New button on the Customer tab. Set the new_customer query to run on the onClick event of the Confirm button on the modal. Close the modal on success.

Step 7: Control access to information based on user roles

  • On the Rental Inventory page, set the Visibility of the Tab widget to show only if the login process returns tokens on success. Toggle the JS button next to the Visible property to enter the following code snippet

{{!!appsmith.store.user.token}}


  • The Rental tab is visible to both the Manager and Assistant roles, so we leave its visibility as is.
  • The Film and Customer tabs should only be visible to the Manager.
  • The Visible property of the individual tabs can be accessed by clicking on the gear icon next to the tab name
  • Add the following snippet of code in the Visible property of both the tabs

{{appsmith.store.user.username !== 'assistant'}}

Step 8: Configure the Logout button

When the user clicks on the Logout button, we will clear the username and token information in the store variable and redirect the user to the Login page. Add the following snippet of code on the onClick event of the Logout button.


{{storeValue('user', {}); navigateTo('Login')}}



So far, we've done a great job. The last part of this project is to deploy our application. On the top right of the editor, click on the 'Deploy' button. You can also deploy this application by connecting it to a Git repository. Our application is now live!

If you're interested in using a database not listed on our website as an integration, please let us know about it by raising a PR on Github, and we will do our best to include it at the earliest.

Join our growing community on Discord, and follow us on Youtube and Twitter to stay up to date.

Build a Payment Link Generator App with Stripe APIs
Vihar Kurama · 29 April 2022

Stripe is one of the most prominent developer tools for integrating payments into your website or application. The service allows you to start accepting payments from users in 14 countries and 24 currencies, and all this is relatively easy to set up! However, not every business needs a full-fledged website for collecting payments from its customers. In this short tutorial, we'll build an app on Appsmith that generates Stripe payment links for you directly from your dashboard. You can create as many payment links as you like and share them with your customers over email, without having to build a storefront of your own.

Appsmith is an open-source application builder that integrates with custom APIs and databases. It's perfect for building your team's internal tools, admin panels, and dashboards.

Let's dive in!

Setting up Stripe Account

The first step in building a payment link generator is to set up a Stripe account. You can either create a new account or log in if you're an existing user.

Please note that this application is built in test mode. To make it a fully functional application that generates live payment links, you will need to add additional details regarding your business, bank, and tax information.


Even in test mode, you will be able to access all the features of the Stripe APIs, but you will not be able to make complete transactions from the generated links.

To make our API requests from Appsmith, we'll need to copy the secret key that's available on the main page of the dashboard.


This secret key lets us access our Stripe account via Bearer Token-based authentication.

In the next section, we'll build a simple UI that lets us generate payment links based on the given customer information and payment price.

Building UI on Appsmith

The first step is to create an account on Appsmith. In this guide, I'll be using the cloud version of Appsmith, but you can always choose to use Appsmith locally or self-host it on your server.

  • Navigate to appsmith.com and sign up for a new account if you're a new user, or log in to your existing one.
  • Create a new application under your preferred organization. You'll see an editor with everything you need to build your internal application.
  • As soon as you create a new app, you'll see a canvas with all the details around widgets and data sources on the left sidebar.

Now, click on the widgets tab and drag and drop a Container widget onto the canvas; this will allow us to group all the widgets in a container. The container is completely customizable; you can add borders, background colours, shadows, and more by opening the property pane.

Inside the container widget, drag and drop a new form widget and add a few input widgets onto the container that lets us collect information for payment links:

  • Product Name
  • Price
  • Quantity
  • Success URL
  • Capture Method

We could also add some additional configuration based on the information that needs to be collected, referring to the Stripe Documentation.


Next, let’s create a new datasource, an API endpoint that’ll create a new Stripe payment link.

  • Click on the + icon next to Datasources from the sidebar
  • Choose API Endpoint and paste the following URL:

https://api.stripe.com/v1/checkout/sessions

  • You can rename this query by double-clicking on the existing name; let’s call it stripe_Session.
  • Stripe APIs use Bearer token-based authentication; hence, the API expects an Authorization header with a bearer token.
  • Copy the token from the Stripe dashboard and paste it into the header.

Authorization: Bearer <token>

  • Lastly, let’s send the payload as FORM_URLENCODED data, since we are collecting all the inputs in a form widget. Alternatively, we could add the payload in the JSON Body field.
To bind the data on the API, we’ll need to use moustache bindings and the input widgets’ names. Here’s how we can access the amount from the price input widget (Stripe expects amounts in the smallest currency unit, hence the multiplication by 100):

{{amountInput.text*100}}


Similarly, we add all the required fields to the payload to create a new session.
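For reference, here is a sketch of the form-urlencoded key-value pairs for a Checkout Session; apart from amountInput, the widget names are assumptions that should match your form:

mode: payment
success_url: {{successUrlInput.text}}
line_items[0][price_data][currency]: usd
line_items[0][price_data][product_data][name]: {{productNameInput.text}}
line_items[0][price_data][unit_amount]: {{amountInput.text*100}}
line_items[0][quantity]: {{quantityInput.text}}
payment_intent_data[capture_method]: {{captureMethodSelect.selectedOptionValue}}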

Our API is now ready; let’s add one more input widget to display the generated Stripe session link (the payment link) created from the data passed through our input widgets.

To display the response from the API endpoint, bind the following in the new input widget:

{{stripe_Session.data.url}}


The .data property on an API request will return the response from the API endpoint; here, we’ve accessed the URL field, which is essentially the payment link.

If you open this URL, you’ll see a new Stripe session with the amount and details you’ve entered on the form.



Build a CRUD App with a Redshift Database
Vihar Kurama · 1 March 2022

Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service. (A data warehouse is a type of data management system designed to enable and support business intelligence (BI) activities, especially analytics.) It makes it easy to manage your data warehouse and automatically distributes your data across multiple nodes to eliminate hot spots and provide high availability. Building a custom frontend for it from scratch, however, is not easy. With Appsmith, it’s possible to create a fully functional and custom frontend in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes a few minutes, and you can quickly build tools on top of the database of your choice.

This blog will teach you how to build a frontend that can connect to Redshift as a datasource.

Connecting Redshift on Appsmith

On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including Redshift. To make the connection, we need the endpoint, database name, and user credentials. With this in mind, let’s get started.

  • Create a new account on Appsmith (it’s free!); if you are an existing user, log in to your Appsmith account.
  • Create a new application on the dashboard under your preferred organization.
  • On your Appsmith application, click on the + icon next to Datasources on the left navigation bar under Page1.
  • Next, navigate to the Create New tab and choose the Redshift datasource.
  • All these details can be found under the configuration settings when using Redshift on a hosted service.

Note: I’m using a free Redshift account on Amazon Web Services (AWS) in this example.

  • Rename the Datasource to Redshift CRUD by double-clicking on the existing one.

Here’s what the configuration would look like:

  • Next, click on the Test button at the bottom right of the screen. This will help you verify whether your configuration is valid. If it returns a successful message, hit the ‘Save’ button to establish a secure connection between Appsmith and Redshift.

Using Redshift’s Seed Data

The basic configuration is complete, so we will now use the seed data already loaded on Redshift (the TICKIT sample data).

Note: After the connection is established, we can see all the tables under the connected datasource.


Now, let’s use the Category table to build our CRUD app!

CRUD on Redshift with Appsmith

Implementing the Read Operation

First, let’s read our data from the database and display it on a beautiful table widget. Follow the below steps:

  • Click on the + icon next to the datasources and click on Create New + from the Redshift CRUD datasource.
  • Rename the query to getCategories
  • Copy the following SQL script to query all the Categories from the category table:


SELECT * FROM public."category" LIMIT 10;


  • This simple query returns all the category details present in the sample data item. Hit the RUN button to view all the results.

We now have our query; let's bind this onto the table widget; for this, follow the below steps:

  • Click the + icon next to widgets on the sidebar, search for the table widget, and drag and drop it onto the canvas.
  • You can make any configurations to any widget via the property pane. Click on the table widget on the canvas; you will see the property pane docked to the sidebar on the right. Now, under the Table Data property, use the moustache syntax to bind the query:
{{getCategories.data}}


With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.

Implementing the Create Operation

To add the create operation on Redshift, let’s build the UI.

  • Drag and drop a button widget onto the canvas. Open its property pane, set the onClick property to Open a New Modal, and choose Create New.
  • This will open up a new modal now; let’s drag and drop a few widgets to create a form that we can use to add new Categories into our database.

Here, we have three input widgets to add to our category. We can configure the default values, labels, and placeholders by selecting the respective property panes. Now, let’s write the query to create a new category on Redshift.

Follow the steps below:

  • Click on the + icon next to the datasources and choose Create New + from the Redshift CRUD datasource
  • Rename the query to addCategory
  • Copy the following SQL script:
INSERT INTO public."category" ("catgroup", "catname", "catdesc")  VALUES ('{{Input1.text}}', '{{Input2.text}}', '{{Input3.text}}');


Implementing the Update Operation

The Update operation is quite similar to the create operation. First, let’s build UI by creating a new custom column on the table by clicking on ADD A NEW COLUMN under the columns property.

Now, rename the column to Edit Category, and click on the cog icon next to it to configure column settings. Under this, we’ll see column-type properties set to a Button type. When the button is clicked, a modal should open up with the necessary fields to update the item.

Now, copy-paste Modal1, rename it to Modal2 and set the onClick property of the Edit Category button to open Modal2. Here, in the form, we can set the default value to show existing information; to display this, use the selectedRow property from the table widget.


Let’s write the Edit query using SQL:

  • Click on the + icon next to the datasources and choose Create New + from the Redshift CRUD datasource
  • Rename the query to editCategory
  • Copy the following SQL script:

UPDATE public."category" SET
    "catgroup" = '{{Input1Copy.text}}',
    "catname" = '{{Input2Copy.text}}',
    "catdesc" = '{{Input3Copy.text}}'
  WHERE "catid" = {{Table1.selectedRow.catid}};

Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the mustache syntax to bind the data from the widgets onto the query body.

We’ll now need to configure the submit button; for this, go back to Modal2 and set the button’s onClick property to execute a query, choosing editCategory under the events property.

Implementing the Delete Operation

The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set it to the button. For this:

  • Create a new custom column on the table by clicking on Add a New Column under the columns property.
  • Now, rename this column to ‘Delete Category,’ and click on the cog icon next to it, to configure column settings. Under this, we’ll see column-type properties set to a button type.

Now, let’s write the Delete query using SQL:

  • Click on the + icon next to the data sources and choose Create New + from the Redshift CRUD datasource
  • Rename the query to deleteCategory
  • Copy the following SQL script:
DELETE FROM public."category"  
    WHERE catid = {{Table1.selectedRow.catid}};


Set the Delete Category button’s onClick property to run the deleteCategory query.

With these four operations configured, you will be able to read and analyze information from your database, edit the data, add or delete information and update records.


A Simple Front-end for Your Redis Datasource
Vihar Kurama · 21 February 2022

Wrapping your mind around how Redis handles data structures and associations, like other non-relational database engines, can sometimes be challenging. This is particularly true when Redis is compared to more traditional relational databases, with their quarantined tables containing multiple rows and columns to house data. Moreover, building a UI on top of a Redis database and managing multiple queries against it is a complicated process.

With Appsmith, it’s possible to create a fully functional and custom frontend in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes a few minutes, and you can quickly build tools on top of the database of your choice.

This blog will teach you how to build a frontend that can connect to Redis as a datasource.

Redis is an open-source, in-memory, key-value data store built for high performance and ease of application development. While Redis can serve many of the same purposes as a traditional database, as a NoSQL store it differs from relational databases in how it describes relationships between data objects.

Getting Started: Connecting Redis on Appsmith

On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including Redis.

To make the connection, we need the host address, port, and database password. With this in mind, let’s get started.

  • Create a new account on Appsmith (it’s free!); if you are an existing user, log in to your Appsmith account.
  • Create a new application on the dashboard under your preferred organization.
  • On your Appsmith application, click on the + icon next to Datasources on the left navigation bar under Page1.
  • Next, navigate to the Create New tab and choose the Redis datasource.
  • When using a hosted Redis service, all these details can be found in its console under the project settings.
  • Rename the Datasource to Redis CRUD by double-clicking on the existing one.

Here’s what the configuration would look like:

  • Next, click on the Test button at the bottom right of the screen. This will help you with understanding whether your configuration is valid or not. If it returns a successful message, hit the Save button to establish a secure connection between Appsmith and Redis.

Creating Key-Values on Redis

We are done with the basic configuration. Now, let’s create some data on Redis and learn a few basic operations on Appsmith.

For the vast majority of data storage with Redis, data will be stored in a simple key/value pair. This can be done using GET and SET commands.
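For example, in the Redis CLI, the following stores and then fetches one pair:

SET book:dune "Frank Herbert"
GET book:dune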

Using GET on Redis

Using the KEYS command, we can fetch all the keys on the Redis datasource. To query this on Appsmith, follow the below steps:

  • Click on the + icon next to the datasources and choose Create New + from the Redis CRUD datasource
  • Rename the query to getKeys
  • Paste the following command under the Query tab:

KEYS *


This will fetch all the keys from the datasource.

You can use this information to bind it on any widget on Appsmith using the moustache operator; in this case, the data from this query can be accessed by the following snippet:


{{ getKeys.data }}


Using SET on Redis

We may want to store some information from our application on Redis, for example, the title and author of a few of our favourite books.

To do this on Appsmith, you can use Input widgets to collect the data dynamically, or create it directly from the query pane. Here, I dragged and dropped two input widgets and a button widget to take inputs dynamically and create key-value pairs on the Redis datasource.

Next, I've created a new query on the Redis datasource that dynamically takes its inputs from these widgets; for this:

  • Click on the + icon next to the datasources and choose Create New + from the Redis CRUD datasource
  • Rename the query to createKeys
  • And paste the following command:

SET {{Input1.text}} {{Input2.text}}

Here, we have a SET command that takes its key and value from the input widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

With this, we can customize and build any kind of UI on top of the Redis datasource.


Build a CRUD App with a Firestore Database
Vihar Kurama · 14 February 2022

Low-code web applications are often discussed when talking about the future of software development. While some developers call it 'no-code' and others call it 'low-code', whatever you call them, these tools are showing no signs of slowing down in growth among small to large businesses, especially as this technology brings about great change even on the back-end side of things. One major technology that seems to have worked well in this market is Firebase, an app-building platform created by Google. Firebase uses an open-source development framework, making it very accessible for developers to swiftly prototype features and integrate them into their apps.

With Firebase and its database, Firestore, most back-end needs are fulfilled without writing code. Building web-based internal applications and admin panels on top of it, however, still requires front-end work, and building UI from scratch is not easy.

This part, however, can be simplified with Appsmith, where you can create a fully functional and custom front-end in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes only a few minutes, and you can quickly build tools on top of the database of your choice.

This blog will teach you how to build a front-end that can connect to Firestore as a datasource.

Firestore is a NoSQL document database built for automatic scaling, high performance, and ease of application development. While the Firestore interface has many of the same features as traditional databases, as a NoSQL database it differs from them in how it describes relationships between data objects.

Getting Started: Connecting Firestore on Appsmith

On Appsmith, it's pretty straightforward to establish a connection with any datasource, including Firestore.

What we need to make the connection are the Database URL, Project Id, and Service Account Credentials. With this in mind, let's get started.

  • Create a new account on Appsmith (it's free!); if you are an existing user, log in to your Appsmith account.
  • Create a new application on the dashboard under your preferred organization.
  • On your Appsmith application, click on the + icon next to Datasources on the left navigation bar under Page1.
  • Next, navigate to the Create New tab and choose the Firestore datasource.
  • When using Firestore, all these details can be found in the Firebase console under project settings.
  • Rename the Datasource to Firestore CRUD by double-clicking on the existing one.

Here’s what the configuration would look like:

Note: For service account credentials, generate a new private key and copy its contents.
  • Next, click on the Test button at the bottom right of the screen. This will help you with understanding whether your configuration is valid or not. If it returns a successful message, hit the 'Save' button to establish a secure connection between Appsmith and Firestore.

Creating a new Table on Firestore

We are done with the basic configuration. Now, let's create a collection on Firestore to build a simple to-do list application and learn all the basic CRUD operations on top of Appsmith.

On Firestore, it's super easy to do this from the console; just hit the create collection button on the dashboard and define all the attributes in the model.

Following are the attributes and data types we use:

  • name: string
  • assigned_to: string
  • complete: boolean
  • deadline: datetime

Alrighty, our collection is now created; let's get into CRUD.

CRUD on Firestore with Appsmith

Implementing the Read Operation

First, let's read our data from the database and display it on a beautiful table widget. Follow the below steps:

  • Click on the + icon next to the datasources and choose Create New + from the Firestore CRUD datasource.
  • Rename the query to getTasks
  • Set the query command to Get Documents in Collection
  • Set the Collection/Document Path * to collection name, which in our case is tasks
  • This simple query returns all the task details in the sample data item. Hit the RUN button to view all the results.

We now have our query; let's bind this onto the table widget; for this, follow the below steps:

  • Click the + icon next to widgets on the sidebar, search for the table widget, and drag and drop it onto the canvas.
  • You can make any configurations to any widget via the property pane. Click on the table widget on the canvas; you will see the property pane docked to the sidebar on the right. Now, under the Table Data property, use the moustache syntax to bind the query:
{{getTasks.data}}


With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.

Implementing the Create Operation

To add the create operation on Firestore, let's build the UI.

  • Drag and drop a button widget onto the canvas. Open its property pane, set the onClick property to Open a New Modal, and choose Create New.
  • This will open up a new modal now; let's drag and drop a few widgets to create a form that we can use to add a new task into our database.

Here, we have three input widgets to add to our tasks. We can configure the default values, labels, and placeholders by selecting the respective property panes. Now, let’s write the query that lets us create a new task on Firestore.

Note: The default value is set to the logged-in user's name using Appsmith’s context object; you can do this by binding {{appsmith.user.name}} in the default text property.

Follow the steps below:

  • Click on the + icon next to the datasources and choose Create New + from the Firestore CRUD datasource
  • Rename the query to createTask
  • Set the commands to Create Document
  • Set the Collection/Document Path * to tasks/{{Math.random().toString(36).substring(7)}} (the expression generates a random string to use as the document ID)
  • Finally, set the body property to:
{
    "name":"{{Input1.text}}",
    "deadline": "{{DatePicker1.formattedDate}}",
    "assigned_to": "{{Input2.text}}",
    "complete": false
}


Here, we have an insert query that collects all the data from the form widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

Lastly, we’ll need to configure the submit button; for this, go back to the modal and set the button’s onClick property to execute a query and choose createTask under the events property:

Implementing the Update Operation

The Update operation is quite similar to the create operation. First, let’s build UI by creating a new custom column on the table by clicking on ADD A NEW COLUMN under the columns property.

Now, rename the column to Edit, and click on the cog icon next to it, to configure column settings. Under this, we’ll see column-type properties set to a Button type. When clicked, a modal should open up with the necessary fields to update the item.

Now, copy-paste Modal1 and rename it to Modal2, and set the onClick property of the Edit Task button to open Modal2. Here, in the form, we can also set the default value to show existing information, to display this, use the selectedRow property from the table widget.


Let’s write the Edit query:

  • Click on the + icon next to the datasources and choose Create New + from the Firestore CRUD datasource
  • Rename the query to editTask
  • Set the commands to Update Document
  • Set the Collection/Document Path * to tasks/{{Table1.selectedRow._ref}}
  • Finally, set the body property to:
{
    "name":"{{Input1Copy.text}}",
    "deadline": "{{DatePicker1Copy.formattedDate}}",
    "assigned_to": "{{Input2Copy.text}}",
    "complete": false
}


Note: The {{ Table1.selectedRow._ref }} snippet evaluates to the selected row’s _ref, which identifies the document we want to edit.

Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

We’ll now need to configure the submit button; for this, go back to Modal2 and set the button’s onClick property to execute a query, choosing editTask under the events property.

Implementing the Delete Operation

The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set it to the button. For this:

  • Create a new custom column on the table by clicking on Add a New Column under the columns property.
  • Now, rename this column to ‘Delete Task,’ and click on the cog icon next to it, to configure column settings. Under this, we’ll see column-type properties set to a button type.

Now, let’s write the Delete query:

  • Click on the + icon next to the data sources and choose Create New + from the Firestore CRUD datasource
  • Rename the query to deleteTask
  • Set the commands to Delete Document
  • Set the Collection/Document Path * to tasks/{{Table1.selectedRow._ref}}

Set the Delete Task button’s onClick property to run the deleteTask query.


With these four operations configured, you will be able to read and analyze information from your database, edit the data, add or delete information and update records.


A simple front-end for your MariaDB datasource
Vihar Kurama · 11 February 2022

There are many parts to building an app, and designing UI elements can take up most of a developer’s time when building from scratch. However, with Appsmith, it’s possible to create a fully functional and custom frontend in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes a few minutes, and you can quickly build tools on top of the database of your choice. For example, you can create admin panels to manage product catalogues, read content data from your database and use that to populate your e-commerce website, and then write more data and update your existing orders in the database. There are so many possibilities!

In this blog, I will teach you how to build a frontend that can connect to MariaDB as a datasource.

MariaDB Server is one of the most popular open-source relational databases. It’s made by the original developers of MySQL and guaranteed to stay open source. It is part of most cloud offerings and the default in most Linux distributions. It is built upon the values of performance, stability, and openness.

Getting Started: Connecting MariaDB on Appsmith

On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including MariaDB, be it on the cloud, a self-hosted version, or a local environment.

What we need to make the connection are the endpoint, database name, and user credentials. With this in mind, let’s get started.

  • Create a new account on Appsmith (it’s free!); if you are an existing user, log in to your Appsmith account.
  • Create a new application on the dashboard under your preferred organization.
  • On your Appsmith application, click on the + icon next to Datasources on the left navigation bar under Page1.
  • Next, navigate to the Create New tab and choose the MySQL datasource (MariaDB is compatible with the MySQL protocol).
  • When you’re using MariaDB on a hosted service, all these details can be found under the configuration settings. For this example, I’m using a free MariaDB RDS on Amazon Web Services (AWS).
  • Rename the Datasource to MariaDB CRUD by double-clicking on the existing one.

Here’s what the configuration would look like:

  • Next, click on the Test button at the bottom right of the screen. This will help you with understanding whether your configuration is valid or not. If it returns a successful message, hit the ‘Save’ button to establish a secure connection between Appsmith and MariaDB.

Creating a New Table on MariaDB

We are done with the basic configuration. Now, let’s create a new table and seed it to build a fully customisable CRUD app on MariaDB.

Note: After the connection is established, we can see all the tables under the connected datasource.

  • Click on the + icon next to Datasources and click on Create New + from the MariaDB CRUD datasource we’ve just created.
  • This will redirect you to the query pane, where you can write MariaDB SQL queries. Rename the query to seedData, choose to create from a template, and paste the following in the query body:
create table projects(
    project_id int auto_increment,
    project_name varchar(255) not null,
    created_at timestamp default current_timestamp,
    primary key(project_id)
);

INSERT INTO projects (project_name, created_at)
  VALUES ('appsmith', '2022-02-02T19:11:23');


This is a simple SQL query that’ll create a projects table; the idea is to build a simple CRUD application that lets us manage open-source projects.

Also, note that we’ve seeded the table with an insert statement where we added the appsmith project.

Alrighty, now that our table is created, let’s get into CRUD.

CRUD on MariaDB with Appsmith

Implementing the Read Operation

First, let’s read our seed data from the database and display it on a beautiful table widget. Follow the below steps:

  • Click on the + icon next to the datasources and choose Create New + from the MariaDB CRUD datasource.
  • Rename the query to getProjects
  • Copy the following SQL script to query all the Projects from the projects table:
SELECT * FROM projects LIMIT 10;


  • This is a simple query that returns all the projects present in the sample data item. Hit the RUN button to view all the results.

We now have our query; let's bind this onto the table widget; for this follow the below steps:

  • Click the + icon next to widgets on the sidebar, search for the table widget, and drag and drop it onto the canvas.
  • Configurations to any widget can be made through the property pane. Click on the table widget on the canvas, you will see the property pane docked to the sidebar on the right. Now, under the Table Data property, use the moustache syntax to bind the query:
{{getProjects.data}}


With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.

Implementing the Create Operation

To add the create operation on MariaDB, let’s build the UI.

  • Drag and drop a button widget onto the canvas. Open its property pane, set the onClick property to Open a New Modal, and choose Create New.
  • This will open up a new modal now; let’s drag and drop a few widgets to create a form that we can use to add new projects into our database.

Here, we have two input widgets to add to our project. We can configure the default values, labels, and placeholders by selecting the respective property panes. Now, let’s write the query that lets us create a new project on MariaDB.

Follow the steps below:

  • Click on the + icon next to the datasources and choose Create New + from the MariaDB CRUD datasource
  • Rename the query to createProject
  • Copy the following SQL script:
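A minimal insert for this form, assuming the modal's two inputs are named Input1 (project name) and Input2 (creation timestamp):

INSERT INTO projects (project_name, created_at)
  VALUES ('{{Input1.text}}', '{{Input2.text}}');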


Here, we have an insert query that collects all the data from the form widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

Lastly, we’ll need to configure the submit button; for this, go back to the modal and set the button’s onClick property to execute a query and choose createProject under the events property:

Implementing the Update Operation

The Update operation is quite similar to the create operation. First, let’s build UI by creating a new custom column on the table by clicking on ADD A NEW COLUMN under the columns property.

Now, rename the column to Edit Project, and click on the cog icon next to it, to configure column settings. Under this, we’ll see column-type properties set to a Button type. When clicked, a modal should open up with the necessary fields to update the item.

Now, copy-paste Modal1, rename it to Modal2, and set the onClick property of the Edit Project button to open Modal2. Here, in the form, we can also set the default values to show existing information; to display this, use the selectedRow property from the table widget.


Let’s write the Edit query using SQL:

  • Click on the + icon next to the datasources and choose Create New + from the MariaDB CRUD datasource
  • Rename the query to editProject
  • Copy the following SQL script:
UPDATE projects SET
    project_name = '{{Input1Copy.text}}',
    created_at = '{{Input2Copy.text}}'
  WHERE project_id = {{Table1.selectedRow.project_id}};


Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

We’ll now need to configure the submit button; for this, go back to Modal2 and set the button’s onClick property to execute a query, choosing editProject under the events property.

Implementing the Delete Operation

The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set it to the button. For this:

  • Create a new custom column on the table by clicking on Add a New Column under the columns property.
  • Now, rename this column to ‘Delete Project,’ and click on the cog icon next to it, to configure column settings. Under this, we’ll see column-type properties set to a button type.

Now, let’s write the Delete query using SQL:

  • Click on the + icon next to the datasources and choose Create New + from the MariaDB CRUD datasource
  • Rename the query to deleteProject
  • Copy the following SQL script:
DELETE FROM projects  WHERE project_id = {{Table1.selectedRow.project_id}};


Set the Delete Project button’s onClick property to run the deleteProject query.

With these four operations configured, you will be able to read and analyze information from your database, edit the data, add or delete information and update records.


A simple Front-end for Your SnowflakeDB Datasource
Vihar Kurama · 1 February 2022

Designing UI elements can take up a majority of a developer’s time when building from scratch. However, with Appsmith, it’s possible to create a fully functional and custom frontend in minutes. A vast array of pre-built UI components, that is, widgets, are available to help you build good-looking applications. Connecting data sources with Appsmith takes a few minutes, and you can easily build tools on top of the database of your choice. For example, you can build admin panels to manage product catalogues, read content data from your database and use that to populate your e-commerce website, and then write more data and update your existing orders in the database. The possibilities are countless.

In this blog, I will teach you how to build a frontend that can connect to SnowflakeDB as a datasource.

Snowflake is a cloud-based data warehouse offered as software-as-a-service (SaaS) that requires no hardware or software installation. Snowflake handles the maintenance and tuning of the underlying cloud infrastructure. It is based on a new SQL database engine with unique features and advantages over more traditional data warehousing approaches.

Getting Started: Connecting Snowflake on Appsmith

On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including SnowflakeDB, be it on the cloud, a self-hosted version, or a local environment.

What we need to make the connection are the endpoint, database name, and user credentials. With this in mind, let’s get started.

  • Create a new account on Appsmith (it’s free!); if you are an existing user, log in to your Appsmith account.
  • Create a new application from the dashboard under your preferred organization.
  • On your Appsmith application, click on the + icon next to Datasources on the left navigation bar under Page1.
  • Next, navigate to the Create New tab and choose the SnowflakeDB datasource; you’ll see the following configuration screen:
[Screenshot: choosing the SnowflakeDB datasource]
  • When you’re using Snowflake cloud, all these details can be found under the Snowflake Cloud settings:
[Screenshot: Snowflake Cloud settings]
  • Rename the Datasource to SnowflakeDB CRUD by double-clicking on the existing one.

Here’s what the configuration would look like:

[Screenshot: SnowflakeDB datasource configuration]
  • Next, click on the Test button at the bottom right of the screen to check whether your configuration is valid. If it returns a successful message, hit the ‘Save’ button to establish a secure connection between Appsmith and SnowflakeDB.

We are done with the basic configuration. Now, let’s use the default database from SnowflakeDB to build a fully customisable CRUD app.

Note: After the connection is established, we can see all the sample data (tables) under the connected datasource.

[Screenshot: sample tables under the connected datasource]

Now that we have the sample data, in the next section, let’s build a fully-fledged CRUD application (on the customer table) on top of our SnowflakeDB using Appsmith.

CRUD on SnowflakeDB with Appsmith

Implementing the Read Operation

First, let’s read our seed data from the database and display it on a beautiful table widget. Follow the below steps:

  • Click on the + icon next to the datasources and choose Create New + from the SnowflakeDB CRUD datasource
  • Rename the query to getCustomers
  • Copy the following SQL script to query all the Customers from the CUSTOMER table:
SELECT * FROM TPCDS_SF100TCL.CUSTOMER LIMIT 10;


  • This is a simple query that returns the first ten customers from the sample dataset. Hit the RUN button to view the results.
[Screenshot: query results]

We now have our query; let's bind it to a table widget by following the steps below:

  • Click the + icon next to widgets on the sidebar, search for the table widget, and drag and drop it onto the canvas.
  • Configurations to any widget can be made through the property pane. Click on the table widget on the canvas, you will see the property pane docked to the sidebar on the right. Now, under the Table Data property, use the moustache syntax to bind the query:
{{getCustomers.data}}


With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane.

[Screenshot: customer data displayed on the table widget]

Implementing the Create Operation

To add the create operation on SnowflakeDB, let’s build the UI.

Drag and drop a button widget onto the canvas. Open its property pane, set the onClick property to Open a New Modal, and choose Create New.

This will open up a new modal now; let’s drag and drop a few widgets to create a form that we can use to add new customers into our database.

[Screenshot: form for adding a new customer]

Here, we have five input widgets to add our customers. We can configure the default values, labels, and placeholders by selecting the respective property panes. Now, let’s write the query that lets us create a new customer on SnowflakeDB.

Follow the steps below:

  • Click on the + icon next to the datasources and choose Create New + from the SnowflakeDB CRUD datasource
  • Rename the query to insertCustomer
  • Copy the following SQL script:

INSERT INTO TPCDS_SF100TCL.CUSTOMER (C_FIRST_NAME, C_LAST_NAME, C_BIRTH_YEAR, C_BIRTH_COUNTRY, C_EMAIL_ADDRESS) VALUES ('{{Input1.text}}', '{{Input2.text}}', {{Input3.text}}, '{{Input4.text}}', '{{Input5.text}}');


Here, we have an insert query that collects all the data from the form widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

Lastly, we’ll need to configure the submit button; for this, go back to the modal and set the button’s onClick property to execute a query and choose insertCustomer under the events property:

[Screenshot: configuring the submit button’s onClick event]

Implementing the Update Operation

The Update operation is quite similar to the create operation. First, let’s build the UI by creating a new custom column on the table: click on ADD A NEW COLUMN under the columns property.

Now, rename the column to Edit Customer, and click on the cog icon next to it to configure the column settings. Under these, set the column type to Button. When clicked, a modal should open up with the necessary fields to update the item.

Now, copy-paste Modal1, and set the onClick property of the Edit Customer button to open Modal2. In this form, we can also set default values to show the existing information; to do this, use the selectedRow property from the table widget.
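For instance, the Default Value property of each copied input can be bound to the matching column of the selected row. A minimal sketch, using two of the column names from the edit query below (the other inputs follow the same pattern):

{{Table1.selectedRow.C_FIRST_NAME}}

{{Table1.selectedRow.C_EMAIL_ADDRESS}}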

[Screenshot: edit customer modal with default values]

Let’s write the Edit query using SQL:

  • Click on the + icon next to the datasources and choose Create New + from the SnowflakeDB CRUD datasource
  • Rename the query to editCustomer
  • Copy the following SQL script:
UPDATE TPCDS_SF100TCL.CUSTOMER SET C_FIRST_NAME = '{{Input1Copy.text}}', C_LAST_NAME = '{{Input2Copy.text}}', C_BIRTH_YEAR = {{Input3Copy.text}}, C_BIRTH_COUNTRY = '{{Input4Copy.text}}', C_EMAIL_ADDRESS = '{{Input5Copy.text}}' WHERE C_CUSTOMER_ID = '{{Table1.selectedRow.C_CUSTOMER_ID}}';


Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

We’ll now need to configure the submit button; for this, go back to Modal2, set the button’s onClick property to execute a query, and choose editCustomer under the events property.

Implementing the Delete Operation

The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set its column type to a button. For this:

  • Create a new custom column on the table by clicking on Add a New Column under the columns property.
  • Now, rename this column to ‘Delete Customer,’ and click on the cog icon next to it to configure the column settings. Under these, set the column type to Button.

Now, let’s write the Delete query using SQL:

  • Click on the + icon next to the datasources and choose Create New + from the SnowflakeDB CRUD datasource
  • Rename the query to deleteCustomer
  • Copy the following SQL script:

DELETE FROM TPCDS_SF100TCL.CUSTOMER WHERE C_CUSTOMER_ID = '{{Table1.selectedRow.C_CUSTOMER_ID}}';


Set the Delete Customer button’s onClick property to run the deleteCustomer query.

With these four operations configured, you can read and analyze data from your database, add new records, and edit, update, or delete existing ones.


We’ve got extensive documentation on the SnowflakeDB datasource integration!

If you’re interested in using a database that is not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.

Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.

A Simple Front-end for Your ArangoDB Datasource
27
January
2022
Resources

A Simple Front-end for Your ArangoDB Datasource

A Simple Front-end for Your ArangoDB Datasource
Vihar Kurama
0
 minutes ↗
#
crud
#
dashboard
#
ui
#
databases
#
graph-database
Resources

A major pain point around building apps is designing the UI elements. Fortunately, with Appsmith, you can create a custom frontend in minutes. Connecting datasources with Appsmith takes a few minutes, and you can easily build tools on top of the database of your choice. For example, you can build admin panels to manage product catalogs, read content data from your database and use that to populate your e-commerce website, and then write more data and update your existing orders in the database. The possibilities are countless.

In this blog, I will teach you how to build a frontend that can connect to ArangoDB as a datasource. 

ArangoDB is a free and open-source native multi-model database system developed by ArangoDB GmbH. The database system supports three data models with one database core and a unified query language AQL. Being multi-model, ArangoDB allows you to run ad-hoc queries on data stored in different models.

Getting Started: Connecting ArangoDB on Appsmith

On Appsmith, it’s pretty straightforward to establish a connection with any datasource, including ArangoDB, whether it’s on the cloud, a self-hosted version, or a local environment.

What we need to make the connection are the endpoint, database name, and user credentials. With this in mind, let’s get started. 

  • Create a new account on Appsmith (it’s free!); if you are an existing user, log in to your Appsmith account.
  • Create a new application from the dashboard under your preferred organization.
  • On your Appsmith application, click on the + icon next to Datasources on the left navigation bar under Page1.
  • Now, navigate to the Create New tab and choose the ArangoDB datasource.


  • When you’re using ArangoDB cloud or a self-hosted instance, all these details can be found under the instance settings.


  • Ensure that SSL mode is enabled so as to establish a secure connection.
  • Rename the datasource to ArangoDB CRUD by double-clicking on the existing name.

Here’s what the configuration would look like.


  • Next, click on the Test button at the bottom right of the screen to check whether your configuration is valid. If it returns a successful message, hit the ‘Save’ button to establish a secure connection between Appsmith and ArangoDB.

Adding Seed Data on ArangoDB

We are done with the basic configuration. Now, let’s create a collection on ArangoDB and push some data from Appsmith to the database. For this, you’ll need to open the ArangoDB endpoint and use the graphical user interface (GUI).


Let’s name the collection ‘Characters’ and set the type to ‘Document’.

Now let’s seed the collection with some data on Appsmith. 

For this, follow the steps below:

  • Click on the + icon next to Datasources and click on Create New + from the ArangoDB CRUD datasource we’ve just created.
  • This will redirect you to the query pane, where you can write ArangoDB AQL queries. Rename the query to seedData, choose to create from the template, and paste the following in the query body:
LET data = [
    { "name": "Robert", "surname": "Baratheon", "alive": false, "traits": ["A","H","C"] },
    { "name": "Jaime", "surname": "Lannister", "alive": true, "age": 36, "traits": ["A","F","B"] },
    { "name": "Catelyn", "surname": "Stark", "alive": false, "age": 40, "traits": ["D","H","C"] },
    { "name": "Cersei", "surname": "Lannister", "alive": true, "age": 36, "traits": ["H","E","F"] },
    { "name": "Daenerys", "surname": "Targaryen", "alive": true, "age": 16, "traits": ["D","H","C"] },
    { "name": "Stannis", "surname": "Baratheon", "alive": false, "traits": ["H","O","P","M"] },
    { "name": "Melisandre", "alive": true, "traits": ["G","E","H"] },
    { "name": "Margaery", "surname": "Tyrell", "alive": false, "traits": ["M","D","B"] },
    { "name": "Jeor", "surname": "Mormont", "alive": false, "traits": ["C","H","M","P"]}}
]
FOR d IN data
    INSERT d INTO Characters


  • Lastly, hit the RUN button at the top right of the query pane. This will add some seed characters to our database.

Now that we have our seed data, in the next section, let’s build a fully-fledged CRUD application on top of our ArangoDB using Appsmith. 

CRUD on ArangoDB with Appsmith

Implementing the Read Operation

First, let’s read our seed data from the database and display it on a beautiful table widget. Follow the below steps:

  • Click on the + icon next to the datasources and choose Create New + from the ArangoDB CRUD datasource
  • Rename the query to readCharacters
  • Copy the following AQL script to query all the items from the Characters collection:
FOR c IN Characters
    RETURN c
  • This is a simple for-loop that iterates over the collection and returns every item. Hit the RUN button to view all the results.


We now have our query; let's bind it to a table widget by following the steps below:


  • Click the + icon next to widgets on the sidebar, search for the table widget, and drag and drop it onto the canvas.
  • Configurations to any widget can be made through the property pane. Click on the table widget on the canvas, you will see the property pane docked to the sidebar on the right. Now, under the Table Data property, use the moustache syntax to bind the query:
{{readCharacters.data}}


With this, we should see all the data displayed on the table. The column names can be configured and re-organized under the property pane. 

Implementing the Create Operation

To add the create operation on ArangoDB, let’s build the UI.

  • Drag and drop a button widget onto the canvas. Open its property pane, set the onClick property to Open a New Modal, and choose Create New.
  • This will open up a new modal; now let’s drag and drop a few widgets to create a form that we can use to create new characters in our database.


Here, we have three input widgets, a checkbox, and a multi-select widget to add our characters. We can configure the default values, labels and placeholders by selecting the respective property panes. Now, let’s write the query that lets us create a new item on ArangoDB. 

Follow the steps below:

  • Click on the + icon next to the datasources and choose Create New + from the ArangoDB CRUD datasource
  • Rename the query to insertCharacter
  • Copy the following AQL script:
INSERT {
    "name": "{{Input1.text}}",
    "surname": "{{Input2.text}}",
    "alive": {{Checkbox1.isChecked}},
    "age": {{Input3.text}},
    "traits": {{MultiSelect1.selectedOptionValues}}
} INTO Characters


Here, we have an insert query that collects all the data from the form widgets we've created. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

Lastly, we’ll need to configure the submit button; for this, go back to the modal, set the button’s onClick property to execute a query, and choose insertCharacter under the events property:


Implementing the Update Operation

The Update operation is quite similar to the create operation.

  • Let’s first build the UI by creating a new custom column on the table: click on ADD A NEW COLUMN under the columns property.
  • Now, rename the column to Edit Character, and click on the cog icon next to it to configure the column settings. Under these, set the column type to Button. When clicked, a modal should open up with the necessary fields to update the item.
  • Now, copy-paste Modal1, and set the onClick property of the Edit Character button to open Modal2. In this form, we can also set default values to show the existing information; to do this, use the selectedRow property from the table widget (a sketch of these bindings follows).
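A minimal sketch of those bindings, using fields from the Characters documents seeded earlier:

{{Table1.selectedRow.name}}

{{Table1.selectedRow.alive}}

The first would go in Input1Copy’s Default Value, and the second in Checkbox1Copy’s default state.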


Let’s write the Edit query using AQL:

  • Click on the + icon next to the datasources and choose Create New + from the ArangoDB CRUD datasource
  • Rename the query to editCharacter
  • Copy the following AQL script:
REPLACE "{{Table1.selectedRow._key}}" WITH {
    "name": "{{Input1Copy.text}}",
    "surname": "{{Input2Copy.text}}",
    "alive": {{Checkbox1Copy.isChecked}},
    "age": {{Input3Copy.text}},
    "traits": {{MultiSelect1Copy.selectedOptionValues}}
} IN Characters


Here, we have an edit query that collects all the data from the form widgets on Modal2. Note that we use the moustache syntax to bind the data from the widgets onto the query body.

Next, configure the submit button; for this, go back to Modal2, set the button’s onClick property to execute a query, and choose editCharacter under the events property.


Implementing the Delete Operation

The delete operation is pretty straightforward with the Table’s selectedRow property; before we dive into it, let’s create a new column on the table and set its column type to a button. For this:

  • Create a new custom column on the table by clicking on Add a New Column under the columns property.
  • Now, rename this column to Delete Character, and click on the cog icon next to it to configure the column settings. Under these, set the column type to Button.

Now, let’s write the Delete query using AQL:

  • Click on the + icon next to the datasources and choose Create New + from the ArangoDB CRUD datasource
  • Rename the query to deleteCharacter
  • Copy the following AQL script:


REMOVE "{{Table1.selectedRow._key}}" IN Characters


Set the Delete Character button’s onClick property to run the deleteCharacter query. 


With these four operations configured, you can read and analyze data from your database, add new records, and edit, update, or delete existing ones.

We’ve got extensive documentation on the ArangoDB datasource integration, along with a video explainer.

If you’re interested in using a database that is not listed on our website as an integration, please let us know about it by raising a PR on GitHub, and we will do our best to include it at the earliest.
Join our growing community on Discord, and follow us on YouTube and Twitter to stay up to date.

The Modern Stack to Build Internal Tools: Supabase, Appsmith, n8n
8
November
2021
Resources

The Modern Stack to Build Internal Tools: Supabase, Appsmith, n8n

The Modern Stack to Build Internal Tools: Supabase, Appsmith, n8n
Vihar Kurama
0
 minutes ↗
#
applications
#
community
#
crud
#
automation
#
beginners
Resources

Developers spend quite a bit of time building internal tools, admin panels, and applications for back-office tasks that help automate everyday essential business processes. These involve multiple efforts, from maintaining a dedicated database to writing lots of frontend and backend code. But what if we told you that you could utilize a modern stack to build such applications that can help with your backend, frontend, and automation tasks? Sounds good, right? It is!

We’re happy to introduce a great new stack to build applications: The Supabase, Appsmith and n8n stack (SAN Stack) for developers to build and maintain modern custom internal tools.

What is the SAN Stack?

SAN stands for Supabase, Appsmith and n8n, after the three emerging and notable tools that make up the stack.

Supabase: An open-source Firebase alternative that lets you create a backend in minutes. Start your project with a Postgres database, authentication, instant APIs, real-time subscriptions and storage.

Appsmith: An open-source framework to build custom business software with pre-built UI widgets that connect to any data source, and can be controlled extensively using JavaScript.

n8n: An extendable workflow automation tool. With a fair-code distribution model, n8n will always have visible source code, be available to self-host, and allow you to add your custom functions, logic and apps.

This stack lets you build any application within minutes. You can use Supabase for the database and backend, Appsmith for UI and adding functionality, and n8n for automating background tasks.

One of Appsmith’s co-founders and head of product, Nikhil Nandagopal, broke down the basics of app building into three steps.

[Screenshot: the three steps of app building]

This has gained quite some traction among developers, especially those looking to build internal tools or applications.

Why Supabase, Appsmith, and n8n?

  • Free / Open source: Supabase and Appsmith are fully open source and can be self-hosted on your own servers, while n8n follows a fair-code distribution model and always has visible source code, which is available to self-host.
  • Low-code yet high functionality: All three platforms follow the principles of the low-code model to help developers deploy and scale their applications in the fastest way possible. However, devs can utilize SQL, JavaScript, and data structures to customize their applications.
  • Editing Experience: Supabase, Appsmith, and n8n have excellent user interfaces and provide a rich editing and debugging experience for developers right from the beginning. Both Appsmith and n8n provide a drag-and-drop interface for building UI and automation workflows, respectively. Supabase, in comparison, offers a live SQL editor to test and play with your database, and can expose your data as APIs directly from the platform.
  • Collaboration: When working with teams, all three platforms offer great collaboration tools; you can share applications or workflows with anyone and set specific permissions such as view-only or edit mode, and these features are consistently being improved on each platform’s roadmap.
  • Self-hosted: Developers can self-host all three platforms on their servers for free. It is useful when you want your data to be more secure, have complete control over customization, and have custom domain options.
  • Fantastic Community: The community is incredible across all three platforms; they provide excellent support and a transparent roadmap. New feature requests or existing bugs are immediately taken care of based on priority. And with a great community, the content gets better and better; they provide rich documentation and many tutorials for devs to get started.

Build a Simple Ticket Manager Using SAN Stack

There are so many tools and applications that can be built across the SAN stack. Here are a couple of examples: An Employee Survey Dashboard and a Ticket Management Admin panel.

Using the SAN stack, you can build any dashboard in just minutes.

As an example, I will show you how to create a support dashboard manager application.

Using this application:

  • Users will be able to create or raise new tickets for a particular query
  • The support team will be able to see these tickets and assign them to engineers
  • When the tickets are resolved, we will be sending an automated email to the users
[Screenshot: support dashboard overview]

Let's get started!

Set up your Backend on Supabase

The first step is to set up the backend for the application; for this, we will be using a Postgres database on Supabase.

  1. If you are new to Supabase, you can create a new account (it's free!) or sign in with your existing credentials.
  2. You will be redirected to the Supabase dashboard; here, you can manage all your projects.
  3. Create a new project, and set the name to Support Dashboard. Create a new table by clicking on the Create Table option on the side navigation.
  4. Supabase gives us many ways to add data to the tables, from writing queries to creating schemas using UI to simply uploading CSV files; developers can choose any option.
  5. For our support dashboard, we will be creating a table by uploading a CSV file on Supabase.
[Screenshot: creating a table on Supabase by uploading a CSV]

The DB is now set up; let’s use Appsmith to connect to this Postgres database and build the UI for the application. For this, we’ll need to note down the connection info from the project settings on Supabase. Here’s what it looks like:

[Screenshot: Supabase connection info]

Building UI on Appsmith and Writing Queries

Our backend is ready; now, let's connect it to Appsmith to build UI and add functionalities. Follow the below steps:

  1. If you are new to Appsmith, you can create a new account (it's free!) or sign in with your existing credentials.
  2. After we sign in, we will be redirected to a dashboard to create a new application.
  3. Next, let's connect a new data source. To do this, click on the + icon next to the Datasources from the sidebar and choose PostgresDB.
  4. Now, copy the database connection details from Supabase to here and click on the test button to validate the authentication.
[Screenshot: PostgresDB datasource configuration]

Awesome, we now have established a connection to our data source. Next, let’s build UI using widgets on Appsmith.

  • Click on the + icon next to widgets and drag and drop a Tab widget. We can configure it using the property pane by clicking on the cog icon in the top-right corner.
  • As seen in the screenshot below, we have added four tabs for the support dashboard.
[Screenshot: tab widget with four tabs]
  • Now, we will add a button that, when clicked, opens a modal with a form to raise a new ticket. For this, just drag and drop a new button widget from the widgets section onto the canvas.
  • Add a few input widgets and a button to submit the form; this is how the form looks after the UI is complete:
[Screenshot: new ticket form]
  • We now have the UI to create a ticket. Let’s write two queries on Appsmith that will allow us to create tickets and list tickets. To do this, click on the + icon next to the data sources and use the Supabase connection here to create a new query.
  • Rename the query to create_new_ticket under the query pane; here we can write SQL that collects inputs using moustache bindings. Here’s what it looks like:
INSERT INTO public."tickets" ("id", "createdAt", "user", "updatedAt", "description",
"status", "priority", "category", "assignedTo")
VALUES ('{{appsmith.store.ticket.id}}', '{{moment().format('YYYY-MM-DD HH:mm:ss')}}', '{{c_user.text}}',
'{{moment().format('YYYY-MM-DD HH:mm:ss')}}', '{{c_description.text}}', '{{c_status.selectedOptionValue}}',
'{{c_priority.selectedOptionValue}}',
'{{c_category.selectedOptionValue}}', '{{c_assignee.selectedOptionValue}}');
On Appsmith, we can use moustache bindings anywhere across the app to bind data or write JavaScript code to customize your widgets and add functionality.
  • Next, set the onClick property for the button to execute a query and select create_new_ticket. And just like that, we should be able to create new tickets from the application.
  • Now, let’s write one more query where we could list all the tickets that users create. We will name this query get_tickets; the following is the SQL snippet.
SELECT * FROM public."tickets";
  • Now, drag and drop a table widget onto the tab widget under the Assigned To Me tab. Open the property pane and add the following snippet to bind the tickets:
{{get_tickets.data.filter(t => t.assignedTo === 'confidence@appsmith.com' && t.status !== 'closed')}}

Fantastic, we should be able to see all the tickets assigned to the specific user! It’s that simple to write custom JS to configure our internal applications on Appsmith. Now let’s use a webhook and build an automation that sends emails from the ticket using n8n!

Building an Extendable Workflow on n8n

If you want to build an internal tool that requires sending emails, then n8n is the way to go. n8n is a tool that can be used to automate workflows between your favorite web apps (such as Slack, Google Drive, Dropbox, etc.). In fact, n8n can connect almost any two web apps that you use. Now, let's create a workflow and use a webhook to send requests to n8n from Appsmith.

  • If you are new to n8n, sign up for their cloud version here.
  • Next, create a new workflow by selecting New under the workflow menu.
  • Now, drag and drop a Webhook node onto the canvas; you can configure the nodes by clicking on them.
  • Now set the HTTP method to POST and copy the TEST URL.

Awesome, now that we have the Webhook, let’s connect it with Appsmith by adding it as a data source.

  • On the Appsmith application, click on the + icon next to the datasources and create a new API.
  • Set the request method to POST and paste the webhook URL.
  • Now paste the following code snippet under the Body tab to collect input from the dashboard:
{"message": "{{this.params.message}}", "email": "{{this.params.email}}", "ticket": "{{appsmith.store.ticket.id}}"}
  • Next, connect a Gmail node to the webhook and authenticate with your Google ID.
  • To pass the data from the webhook to the Gmail node, configure the message property using the variable selector nodes on the left pane (a sketch of such an expression follows this list).
  • Lastly, make sure you set the workflow state to active.
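For instance, assuming the webhook node is named Webhook and receives the JSON body shown above, the Gmail node’s message field might reference the payload with an expression along these lines (the exact path depends on your webhook’s output, so verify it with the variable selector):

{{ $node["Webhook"].json["body"]["message"] }}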

And just like that, we should be able to send emails using n8n without writing a single piece of code.

[Screenshot: completed n8n workflow]

Building this app from scratch, including writing snippets of code, is likely to take 30 minutes! Isn’t that fast?

If you're looking for a modern approach to building internal applications, check out Supabase, Appsmith, and n8n! These tools are straightforward, powerful, and can help you build apps faster than ever. So what are you waiting for? Start building your next internal app today.

Connect Multiple Data Sources in Your App using these 6 Open Source Tools
16
August
2021
Resources

Connect Multiple Data Sources in Your App using these 6 Open Source Tools

Connect Multiple Data Sources in Your App using these 6 Open Source Tools
Vyom Srivastava
0
 minutes ↗
Resources

While building an application, there are times when you have to use multiple data sources. It could be for security, availability, or even cost reasons. Often, connecting and managing those data sources can be challenging! You either need to build a system from scratch to connect those data sources or use a tool.

Let’s say you’re running an analytics dashboard, and you have data coming from multiple sources like MySQL, Redshift, S3, and MongoDB. Would you want your data scientists to focus on making all API connections, connecting all the data sources, and processing them in a specific format? Or, would you instead let them focus on the other critical parts of their job by using an easy solution and connect those data sources through an app?

Building a complete system from scratch is generally not advisable because of the enormous constraints on time. Teams also have to go through the documentation for each data source, keep track of everything, such as changelogs and updates in that documentation, and change the codebase accordingly. Handling this would mean hiring dedicated resources or putting the team under pressure to manage it all. Using an application to oversee the connections makes sense, while ensuring that data sources stay secure and reliable.

Building everything from scratch is time-consuming and can also distract you from your core work.

To help you stay on course, we’ve put together a nifty list of great tools to help you connect multiple data sources without hassles.

Talend

Talend is an open-source data automation tool that can be used to connect multiple data sources. It makes setting up ETL (Extract, Transform, and Load) pipelines simpler and quicker. It offers a scalable architecture and robust data integration, along with a suite of open-source tools divided into several components; combined, they become a potent tool for ETL and connecting multiple data sources.

Talend provides an easy and intuitive way to transform the data. Instead of mapping databases and filling out the forms for different databases, you can just use their graphical tool for mapping and transforming the data.

It also supports data conversion into multiple business formats such as OLAP, Jasper, SPSS, and Splunk.

Supported data sources:

  • MSSQL
  • DB2
  • Postgres
  • MySQL
  • Teradata
  • Greenplum
  • Oracle
  • Sybase
  • Vertica
  • Informix

ArcESB

ArcESB is another open-source data integration and pipeline setup tool that can connect multiple data sources. It synchronizes data in real time, which means that as soon as data becomes available in the source, it is made available through ArcESB.

It supports a wide array of protocols such as AS2, AS4, OFTP, SFTP, etc. The drag-and-drop approach makes it easy to connect complex workflows and quickly transform the data into popular formats like JSON, XML, and CSV.

Supported data sources:

  • MongoDB
  • Dropbox
  • Google Sheets
  • MySQL
  • CSV
  • HubSpot
  • SQLite
  • AWS S3

Apache Camel

Apache Camel is an open-source data integration tool that is somewhat underrated in the ETL space. It can be deployed as a standalone application, in a web container, a JEE application server, OSGi, or even a Spring container, connecting complex workflows easily and quickly. It allows programmers to split integration problems into smaller pieces, closing the gap between integration and implementation.

It has 3.4k stars and 4.3k forks on GitHub. It supports almost every available protocol, like HTTP, FTP, JMS, EJB, JPA, RMI, JMX, LDAP, Netty, and many more. Therefore, it is a good fit when you’re dealing with multiple internet protocols. One of the best parts of Apache Camel is that it uses the same workflow for all the supported protocols.

Supported data sources:

  • MySQL
  • ActiveMQ
  • Async HTTP
  • REST API
  • AWS
  • Azure
  • Cassandra CQL
  • MongoDB
  • Digital Ocean
  • FTP/SFTP
  • Google BigQuery
  • Google Sheets
  • GraphQL
  • RabbitMQ and many more which are listed here

Pentaho Kettle

Pentaho Kettle, also known as the Pentaho Data Integration tool, supports multiple data integrations, OLAP, data mining, reporting, and ETL capabilities. It is also known for its ease of use and quick learning curve. It allows users to create data manipulation jobs with a user-friendly graphical creator without entering a single line of code.

It comes with a set of tools that includes:

  • Spoon: This is a graphical tool that lets you create and set up ETL pipelines without using a single line of code. You can validate, refine, transform, and write the data to various data destinations.
  • Pan: This is a tool to run data transformations.
  • Chef: This tool is used to create jobs.
  • Kitchen: This tool is used to execute jobs created by Chef.
  • Carte: This is a lightweight web server that lets you run and host your transformations and jobs remotely.

Hevo Data

Hevo is a no-code ETL tool that you can use to connect multiple data sources. It supports more than 100 data sources and follows a 3-step data connection setup across all of them: select the data source, enter the correct credentials, and select the destination.

It uploads the data to the selected destination and allows you to perform data manipulations and transformations. It also comes with a fault-tolerant system that ensures the data is consistently transferred in a secure manner using encryption.

Supported data sources:

  • AWS
  • MySQL
  • MongoDB
  • Snowflake
  • TokuDB and many more which you can check here

Appsmith

We have created Appsmith in such a way that you can easily connect multiple data sources with just a few steps. You can even pass the data through an API using our platform. The drag-and-drop UI makes the connection part very easy and allows you to connect data sources like AWS, MySQL, MongoDB, etc.

The credentials that you store in our application are encrypted before being stored, so you don’t have to worry about security. Since our platform acts as a proxy layer, we do not store any data from your data sources.

To improve performance, our platform creates a pool of connections with the database server, which lets you run multiple queries simultaneously, something that isn’t possible over a single connection. We have also created in-depth tutorials that you can check out to understand the benefits of other features as well.

Supported data sources:

  • Amazon S3 (also Upcloud, Digital Ocean Spaces, Wasabi, DreamObjects)
  • ArangoDB
  • DynamoDB
  • ElasticSearch
  • Firestore
  • MongoDB
  • MS SQL
  • MySQL
  • PostgreSQL
  • Redis
  • Redshift
  • Snowflake

Are you interested in building something with Appsmith? Take it for a spin! Join our vibrant community on Discord. To get regular updates on what we’re up to, follow us on Twitter.

Building an Admin Dashboard with React-admin
29
April
2021
Resources

Building an Admin Dashboard with React-admin

Building an Admin Dashboard with React-admin
Confidence Okoghenun
0
 minutes ↗
#
admin
#
dashboard
#
frontend-frameworks
Resources

React-admin has been one of the holy grail frontend frameworks for building responsive admin panels. It offers a lot of really cool features such as data validation, optimistic rendering, accessibility, and action undo. React-admin is also plug-and-play as it supports standard REST APIs and a handful of GraphQL dialects. Being a React framework, it also gives you access to thousands of plugins and libraries available in JavaScript and the React ecosystem.

In this article, I would like to show you how to build an admin panel using React-admin.

We’re going to be building a dashboard to manage DVD movie rentals for a local rental store. The first page will have a table listing all registered members of the store. The second page will have a table that holds all rental records. From here, new rental entries can be created and existing rentals can be updated, i.e., from borrowed to returned. We will also be able to click on a customer from the first page and then be taken to the rentals page to see his rental history.

Here’s a link to the completed application.

You can view the demo app here

Dashboard link: as-react-admin.netlify.app

username: cokoghenun@appsmith.com

password: 123456

Through building this dashboard, we’re going to cover core React-admin concepts such as

  • Resources
  • List view
  • Edit/Create view
  • Reference inputs and
  • Authentication

Since React-admin requires an API server, we would need to build one on top of the database. Speaking of the database, we’ll be making use of MongoDB, and the demo dataset is a modified version of the Sakila dataset.

To save time and get to the fun part of building the dashboard with React-admin, we’ll be making use of Loopback to generate a Nodejs API over the database. If you are not familiar with Loopback, it is a highly extensible Node.js and TypeScript framework for building APIs and microservices.

You can skip this if you already have an API to use

We’re almost set. But before we begin, I’d like to give you a mini-map of the entire article. The first part of this article will focus on generating an API server over the database on MongoDB using Loopback. The second part of this article would cover how to use React-admin to build a dashboard from the API generated in the first section.

Alright, everything looks good. Let’s get started!

Generating an API server

There are many ways to build an API server. You can roll up your sleeves and build one yourself (this takes a lot of time), or you can choose to go with a framework. Loopback is the fastest framework I found for building Nodejs APIs over a database. It supports a host of databases, ranging from in-memory to document to relational databases.

The API that would be generated using Loopback will have three resources, the first being the customer resource that represents customers who come to rent DVDs from the store. We also have the film resource, representing DVDs that are in stock. Lastly, we have the rentals resource, which records each rental.

Here’s the schema for each resource

// Customer resource
{
  "store_id": String,
  "first_name": String,
  "last_name": String,
  "email": String,
  "address_id": String,
  "activebool": Boolean,
  "create_date": Date,
  "last_update": Date,
  "active": Number
}


// Film resource
{
  "title": String,
  "description": String,
  "release_year": Number,
  "language_id": String,
  "rental_duration": Number,
  "rental_rate": Number,
  "length": Number,
  "replacement_cost": Number,
  "rating": String,
  "last_update": Date,
  "special_features": String,
  "fulltext": String
}

// Rental resource
{
  "status": String,
  "rental_date": Date,
  "film_title": String,
  "customer_email": String,
  "return_date": Date,
  "staff_id": String,
  "last_update": Date
}

Okay! Now let’s get started by installing the Loopback CLI with npm

npm i -g @loopback/cli

We can easily scaffold the Nodejs server using the Loopback CLI. It configures a Typescript compiler and installs all required dependencies. Let’s run the CLI and answer a few prompts to generate a new app

lb4 app

You should have your app configured as shown below

Hit enter and give the CLI some time to set up the app.

Creating a model

Now that the loopback app has been scaffolded, cd (change directory) into the app folder, and let’s start by creating a model for each resource. A model communicates the shape of each document for a particular resource, much like the schema shown earlier.

Let’s create a model for the customer resource using the Loopback CLI

lb4 model

As we did when generating the app, answer the CLI prompts. Yours should look like this

[Screenshot: configuring the customer model]

Great Job! Now, go ahead and do the same for the film and rental resources. Don’t forget that to create a new model, you’ll need to run the lb4 model command.

Connecting to the database

Next, we’ll need to link the Loopback app to the Mongo database. Loopback provides two entities to help us accomplish this, and they are the datasource and repository mechanisms.

A datasource represents a database connection that would be used to store and retrieve documents from the database i.e MongoDB or PostgreSQL. On the other hand, a repository links a resource on the Loopback app to a particular table or collection in the database. For example, the customer resource is linked to the Customer collection in the database using a repository.

Now, let’s add a datasource to the app, and link it to our MongoDB database. We can easily do this using the CLI command below

lb4 datasource

As usual, go ahead and answer the CLI prompts, supplying the database credentials to the CLI

[Screenshot: configuring the mongo datasource]
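For reference, the answers from the prompts end up in a generated datasource config file roughly like the following (the host, user, and database values here are hypothetical placeholders):

// src/datasources/mongo.datasource.ts (sketch; values are placeholders)
const config = {
  name: 'mongo',
  connector: 'mongodb',
  url: '', // a full connection string can be used instead of the fields below
  host: 'cluster0.example.mongodb.net',
  port: 27017,
  user: 'admin',
  password: '<password>',
  database: 'sakila',
  useNewUrlParser: true,
};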

Awesome! Now we can add a repository for each resource.

Run the command below and let’s set up a repository for the customer resource. Notice that we have to link the created repository to its target model; in this case, that is the customer model.

lb4 repository
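After answering the prompts, the generated repository should look roughly like this (a sketch; it assumes the datasource from the previous step was named mongo):

// src/repositories/customer.repository.ts (sketch of CLI-generated code)
import {inject} from '@loopback/core';
import {DefaultCrudRepository} from '@loopback/repository';
import {MongoDataSource} from '../datasources';
import {Customer} from '../models';

export class CustomerRepository extends DefaultCrudRepository<
  Customer,
  typeof Customer.prototype.id
> {
  constructor(@inject('datasources.mongo') dataSource: MongoDataSource) {
    // links the Customer model to the mongo datasource
    super(Customer, dataSource);
  }
}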

Cool! Go ahead and do the same for the film and rental repositories. I’m confident you can finish up on your own 😜

Adding CRUD functionality

Great Job! That was a lot we just covered. Right now, we have models for each resource, a datasource, and repositories linking each model to its respective collection in the database.

The last piece of the puzzle is to add CRUD functionality for each resource.

We can do this by creating controllers. Controllers do the grunt work of creating, reading, updating, and deleting documents for each resource.

As you may have already guessed, we can create a controller using the controller command. Now, let’s create a REST controller for the customer resource. Notice we’ll need to use the model and repository created earlier for the customer resource.

lb4 controller
Note that the Id is a string and is not required when creating a new instance
[Screenshot: creating a controller]

As usual, go ahead and do the same for the film and rental resources.

Awesome! We now have a full-blown REST API that was generated in a few minutes. Open up the project folder in your favorite code editor and you’ll see all the code (and folders) generated by Loopback.

I recommend you change the default port in the index.ts file to something else, i.e., 4000, because Create React App (used by React-admin) runs on port 3000 by default.
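In the generated src/index.ts, that’s a one-line change; a sketch of the relevant excerpt (your generated file may differ slightly):

// src/index.ts (sketch)
const config = {
  rest: {
    port: +(process.env.PORT ?? 4000), // changed from 3000; CRA’s dev server uses 3000
    host: process.env.HOST,
  },
};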

You can start the server using the start script

npm start

You can find a playground and the auto-generated API documentation for your server by visiting the server address on your browser i.e http://localhost:4000/

[Screenshot: API explorer]

Alright! Now that we have a REST API server with CRUD functionality, we can move on to creating the admin dashboard using React-admin.

Enter React-admin

We’ve finally gotten to the fun part, yay!

As a quick recap, we have a Loopback API generated in the last section that serves the customer, film, and rental resources with the following endpoints and data schema

// /customers endpoint
{
  "store_id": String,
  "first_name": String,
  "last_name": String,
  "email": String,
  "address_id": String,
  "activebool": Boolean,
  "create_date": Date,
  "last_update": Date,
  "active": Number
}


// /films endpoint
{
  "title": String,
  "description": String,
  "release_year": Number,
  "language_id": String,
  "rental_duration": Number,
  "rental_rate": Number,
  "length": Number,
  "replacement_cost": Number,
  "rating": String,
  "last_update": Date,
  "special_features": String,
  "fulltext": String
}

// /rentals endpoint
{
  "status": String,
  "rental_date": Date,
  "film_title": String,
  "customer_email": String,
  "return_date": Date,
  "staff_id": String,
  "last_update": Date
}

So here’s the game plan. We’re going to use this API to build a dashboard to manage DVD movie rentals. The first page would be a table showing all customers. Then we can click on a customer and view all his rentals on a new page. We can update the return date and status of each rental i.e from borrowed to returned. Lastly, we can view all rentals on the rentals page and create a new entry or edit an existing one.

Phew! Now we can finally begin with React-admin 😅

React-admin is a powerful front-end framework for building admin panels and dashboards. It is highly customizable and has a host of other great features. Since it is based on Reactjs, it can be used with thousands of other Reactjs and Javascript libraries.

React-admin requires a base React project. We are going to be going with Create React App (CRA) in this article. So let’s set up the project with CRA

npx create-react-app rental-admin-panel

Give the CLI some time to install all dependencies and finish setting up the project. Then, cd into the project directory and go ahead and install React-admin and the Loopback dataProvider.

npm install react-admin react-admin-lb4

A dataProvider is the mechanism with which React-admin communicates with a REST/GraphQL API. The Loopback provider for React-admin enables it to understand and use Loopback APIs i.e how to paginate or filter requests. If you aren’t using a Loopback generated API, you should look into using one of these dataProviders for React-admin.

Open up the project in your favourite code editor and replace everything in the App.js file with the below starter code

//src/App.js

import React from 'react';
import lb4Provider from 'react-admin-lb4';
import { Admin, Resource } from 'react-admin';

function App() {
  return (
    // ------------- Replace the below endpoint with your API endpoint -------------
    <Admin dataProvider={lb4Provider('http://localhost:4000')}>
      <Resource name='customers' />
    </Admin>
  );
}

export default App;

So far so good. But we have some new concepts to clear up. In the starter code above, we supply a dataProvider to React-admin which enables it to query the API. The next thing we did up there is to register a resource from the API that we would like to use in React-admin. This is done simply by supplying the endpoint as a name prop to the <Resource> component.

You don’t need to add the forward-slash “/” to the resource name

Going by this rule, we must register it as a resource whenever we need to query a new API endpoint. In this way, React-admin becomes aware of it. Moving on...

Creating the Customers' table

The easiest way to view all customers’ info is to have a paginated table displaying it. React-admin makes it easy to do this by providing us with a <List> component.

The <List> component generates a paginated table that lists out all documents in a particular resource. We can choose which fields we want to show up on the table by wrapping them in the appropriate <Field> component i.e a date property on a document would be wrapped in a <DateField> component.

The data property on the document is linked to the <Field> component using the source prop. This prop must contain the exact property name. And the field name showing up on the table can be customized using the label prop.

We can also create a filter for the table using the <Filter> component and specify an action to be triggered whenever an item is clicked on the table using the rowClick props on the <Datagrid> component. You can learn more about filtering here and row actions here

Alright! So we want a customer table to show all the customers. We also want this table to be filterable by customer email. Lastly, we want to be able to click on a customer and see all his rentals (we haven’t created the rentals page yet, but we will shortly).

Let’s see all of this in action. Go ahead and create a customer list component with the following content

//src/components/CustomerList.js

import React from 'react';
import { List, Filter, Datagrid, TextField, SearchInput, } from 'react-admin';

// ------------- filter component which filters by customer email -------------
const CustomerFilter = (props) => (
  <Filter {...props}>
    <SearchInput source='email' alwaysOn />
  </Filter>
);

const CustomerList = (props) => (
  <List {...props} filters={<CustomerFilter />} title='List of Customers'>
    {/* ------------- rowClick action that redirects to the rentals of the selected customer using the customer email ------------- */}
    <Datagrid
      rowClick={(id, basePath, record) => {
        return `/rentals?filter=%7B%22customer_email%22%3A%22${record.email}%22%7D&order=ASC&page=1&perPage=10&sort=film_title`;
      }}
    >
      {/* fields reconstructed from the customer schema shown earlier */}
      <TextField source='first_name' />
      <TextField source='last_name' />
      <TextField source='email' />
      <TextField source='store_id' />
      <TextField source='active' />
    </Datagrid>
  </List>
);
);

export default CustomerList;

Next, we need to link the <CustomerList> component with the customer resource component.

//src/App.js

// ------------- import CustomerList -------------
import CustomerList from './components/CustomerList'; 

//…

// ------------- use CustomerList as the list view on the customer resource -------------
<Resource name='customers' list={CustomerList} />
Save your code and let’s head over to the browser. You can see we have a nice paginated and filterable customer table that has been automatically generated and is rendering customer information from the API. Cool right? 😎

8-customer table ui.png

Not so fast! Go ahead and create a similar list table for the rental resource. You can name this component RentalList. If you are curious or get stuck, feel free to fall back on the code here.

Creating and Editing a Rental

We have two more views to create and they are the edit and create view for the rental resource. They are quite similar to each other and are both similar to the list view but with a few differences.

The edit view would be used to edit an item clicked on the rental table.

To wire up this behaviour, ensure that you have rowClick='edit' on the <Datagrid> component in <RentalList>

An edit view uses a <SimpleForm> component, which in reality is a simple form with nested <Input> components. Like with the <Field> components, each <Input> component used is based on the data type of the property to be edited i.e a <TextInput> component is used on a text property. Inputs also require the source props and optional label props as we’ve already seen with the <Field> component.

Bringing it all together, the edit view for the rental resource would look like this:

Notice that some inputs have been disabled using the disabled props
// src/components/RentalEdit.js
import React from 'react';
import {
  Edit,
  SimpleForm,
  TextInput,
  DateTimeInput,
  SelectInput,
} from 'react-admin';

const RentalEdit = (props) => (
  <Edit {...props} title='Edit Rental'>
    <SimpleForm>
      {/* disabled inputs reconstructed from the rental schema shown earlier */}
      <TextInput disabled source='id' />
      <TextInput disabled source='film_title' />
      <TextInput disabled source='customer_email' />
      <DateTimeInput disabled source='rental_date' />

      <SelectInput
        source='status'
        choices={[
          { id: 'borrowed', name: 'borrowed' },
          { id: 'returned', name: 'returned' },
        ]}
      />
      <DateTimeInput source='return_date' />
    </SimpleForm>
  </Edit>
);

export default RentalEdit;

Don’t forget to import and use the edit view in the rental resource component in your App.js file.


//src/App.js

// ------------- import RentalEdit' -------------
import RentalEdit from './components/RentalEdit'; 

//…

// ------------- use RentalEdit as the edit view on the rental resource -------------
<Resource name='rentals' list={RentalList} edit={RentalEdit} />

Save your files and let’s head to the browser. Click on a rental to see the magic!

Okay, so we’ve completed the edit view. Now moving on to make the create view.

The create view is quite similar to the edit view. It’s so similar that I’m just going to paste the code right here and you wouldn’t be able to tell the difference. Just kidding 😜. Anyway, here’s the code for the create view

// src/components/RentalCreate.js
import React, { useState, useEffect } from 'react';
import {
  Create,
  SimpleForm,
  DateTimeInput,
  SelectInput,
  useNotify,
  useRefresh,
  useRedirect,
  useQuery,
  TextInput,
} from 'react-admin';

const RentalCreate = (props) => {
  const notify = useNotify();
  const refresh = useRefresh();
  const redirect = useRedirect();

  const onSuccess = ({ data }) => {
    notify(`New Rental created `);
    redirect(`/rentals?filter=%7B"id"%3A"${data.id}"%7D`);
    refresh();
  };

  const [customers, setCustomers] = useState([]);
  const { data: customer } = useQuery({
    type: 'getList',
    resource: 'customers',
    payload: {
      pagination: { page: 1, perPage: 600 },
      sort: { field: 'email', order: 'ASC' },
      filter: {},
    },
  });

  const [films, setFilms] = useState([]);
  const { data: film } = useQuery({
    type: 'getList',
    resource: 'films',
    payload: {
      pagination: { page: 1, perPage: 1000 },
      sort: { field: 'title', order: 'ASC' },
      filter: {},
    },
  });


  useEffect(() => {
    if (film) setFilms(film.map((d) => ({ id: d.title, name: d.title })));
    if (customer)
      setCustomers(customer.map((d) => ({ id: d.email, name: d.email })));
  }, [film, customer]);

  return (
    <Create {...props} onSuccess={onSuccess}>
      <SimpleForm>
        {/* select inputs populated from the manual customer and film queries above */}
        <SelectInput source='customer_email' choices={customers} />
        <SelectInput source='film_title' choices={films} />

        <SelectInput
          source='status'
          choices={[
            { id: 'borrowed', name: 'borrowed' },
            { id: 'returned', name: 'returned' },
          ]}
        />
        <DateTimeInput source='rental_date' />
        <DateTimeInput source='return_date' />

        {/* staff_id is a plain text field in the rental schema */}
        <TextInput source='staff_id' />
      </SimpleForm>
    </Create>
  );
};

export default RentalCreate;

The only difference here is that we have two select inputs that display a list of all customers and films by manually querying those resources.

Instead of writing custom logic to query the customer and film resources, we could have easily used the built-in <ReferenceInput> component. But currently, there's no way to set the selected value from the <SelectInput> component to something other than the document id. In the create form, we require the email field from the customer resource and the title field from the film resource. That is why we are querying manually; otherwise, the <ReferenceInput> component would have been awesome.
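For comparison, the built-in approach would have looked roughly like the sketch below; customer_id here is hypothetical, since our rental schema stores customer_email rather than an id, which is exactly why <ReferenceInput> didn’t fit:

<ReferenceInput label='Customer' source='customer_id' reference='customers'>
  <SelectInput optionText='email' />
</ReferenceInput>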

Do not forget to import and use the create view we just made. Also, register the film resource in App.js

//src/App.js

// ------------- import RentalCreate -------------
import RentalCreate from './components/RentalCreate';


//…

// ------------- use RentalCreate as the create view on the rental resource -------------
<Resource name='rentals' list={RentalList} edit={RentalEdit} create={RentalCreate} />
// ------------- register the film resource -------------
<Resource name='films' />
If a resource is registered and no list view is passed to it, React-admin hides it from the navbar. But the resource is still useful for querying as we did for the film select input in the <RentalCreate> component.

This is the moment you’ve been waiting for! Save your files and head over to the browser. You’ll notice that we now have a create button on the rentals table, and clicking on a rental takes you to edit that rental. Sweet!

We’ve finally completed the dashboard! 🥳 🎉 🎊

We have a complete admin panel to manage rentals. We can see a list of customers, select a customer and view all his orders and lastly, we can create new rental entries or edit existing ones. Awesome!

For some extra credit, let's add some authentication.

Extra credit: Authentication

You must add some authentication to your app, else anyone, even malicious individuals, would be able to use it! Thankfully, adding authentication to our API and admin dashboard is not too difficult.

The first part of this section will focus on adding authentication to the Loopback API. You can skip this if you’ve been following along with your API. Next, we’ll look at implementing auth on the React-admin panel.

Securing the API

Loopback has various authentication strategies that we can employ to secure the API. We are going to go with the JWT authentication strategy, mostly because it’s super easy to set up and is fully supported by React-admin.

Enough talk, let's get started by installing the JWT auth extension library and Validatorjs on the Loopback API server.

npm i --save @loopback/authentication @loopback/authentication-jwt @types/validator

Next, bind the authentication components to the application class in src/application.ts

// src/application.ts

// ----------- Add imports -------------
import {AuthenticationComponent} from '@loopback/authentication';
import {
  JWTAuthenticationComponent,
  SECURITY_SCHEME_SPEC,
  UserServiceBindings,
} from '@loopback/authentication-jwt';
import {MongoDataSource} from './datasources';
// ------------------------------------

export class TodoListApplication extends BootMixin(
  ServiceMixin(RepositoryMixin(RestApplication)),
) {
  constructor(options: ApplicationConfig = {}) {

    //…

    // ------ Add snippet at the bottom ---------

    // Mount authentication system
    this.component(AuthenticationComponent);
    // Mount jwt component
    this.component(JWTAuthenticationComponent);
    // Bind datasource
    this.dataSource(MongoDataSource, UserServiceBindings.DATASOURCE_NAME);
    // ------------- End of snippet -------------
  }
}

Great job! We now have a foundation for auth.

Authentication usually works by validating the credentials of the user attempting to sign in and letting them through if valid credentials are supplied. We'll therefore need to create a user resource to represent a user. For our purposes, a user only has an id and an email field.

Alright, let's create the user model using the Loopback CLI, answering the prompts as shown below.

lb4 model

11-user model.png
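If you answered the prompts as above, the generated model should look roughly like the sketch below; your file may differ slightly depending on your CLI answers:

// src/models/user.model.ts (approximate output of the lb4 model prompts above)
import {Entity, model, property} from '@loopback/repository';

@model()
export class User extends Entity {
  // Primary key for the user document
  @property({
    type: 'string',
    id: true,
    generated: true,
  })
  id?: string;

  @property({
    type: 'string',
    required: true,
  })
  email: string;

  constructor(data?: Partial<User>) {
    super(data);
  }
}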

We’ll also need to create a controller for the user resource that handles all authentication logic. You can use the CLI to generate an empty controller.

Note that this controller needs to be an empty controller, not a REST controller.
lb4 controller
12-user controller.png

The generated empty controller file can be found in src/controllers/user.controller.ts. Copy the contents of the file linked here into your controller file; it contains all the authentication logic.

Finally, we can secure the customer resource by adding the authentication strategy we just implemented to its controller. Here’s how to do it:

// src/controllers/customer.controller.ts

// ---------- Add imports -------------
import {authenticate} from '@loopback/authentication';

// ------------------ Add auth decorator -----------
@authenticate('jwt') // <---- Apply the @authenticate decorator at the class level
export class CustomerController {
  //...
}

Do the same for the film and rental resources by adding the authentication strategy to their respective controller files.

And that's it! Visit the API explorer in your browser at http://localhost:4000/explorer/ and you'll notice a nice green Authorize button at the top of the page. We also now have signup and login routes to create user accounts and log in.

You'll need to use this explorer to create a new user.
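If you'd rather create the user from a terminal or script than from the explorer UI, the request is a plain POST. Here's a hedged sketch; the exact route and payload shape come from the controller file you copied, so adjust if yours differs:

// A sketch, assuming the /signup route defined in the copied controller
fetch('http://localhost:4000/signup', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ email: 'admin@example.com', password: 'changeme' }),
})
  .then((res) => res.json())
  .then((user) => console.log('created user:', user));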

Now, let’s use this authentication on the React-admin dashboard.

Adding authentication to React-admin

Implementing authentication on the React-admin dashboard is fairly straightforward. We need an authProvider, an object containing methods for the authentication logic, and an httpClient function that adds the authorization header to every request made by the dashboard.

Create a file at src/Auth.js containing the authProvider object and the httpClient function. Here's what the content of the file should be:

// src/Auth.js

export const httpClient = () => {
  const { token } = JSON.parse(localStorage.getItem('auth')) || {};
  return { Authorization: `Bearer ${token}` };
};

export const authProvider = {
  // authentication
  login: ({ username, password }) => {
    const request = new Request(
      process.env.REACT_APP_API_URL + '/users/login',
      {
        method: 'POST',
        body: JSON.stringify({ email: username, password }),
        headers: new Headers({ 'Content-Type': 'application/json' }),
      }
    );
    return fetch(request)
      .then((response) => {
        if (response.status < 200 || response.status >= 300) {
          throw new Error(response.statusText);
        }
        return response.json();
      })
      .then((auth) => {
        localStorage.setItem(
          'auth',
          JSON.stringify({ ...auth, fullName: username })
        );
      })
      .catch(() => {
        throw new Error('Network error');
      });
  },
  checkError: (error) => {
    const status = error.status;
    if (status === 401 || status === 403) {
      localStorage.removeItem('auth');
      return Promise.reject();
    }
    // other error code (404, 500, etc): no need to log out
    return Promise.resolve();
  },
  checkAuth: () =>
    localStorage.getItem('auth')
      ? Promise.resolve()
      : Promise.reject({ message: 'login required' }),
  logout: () => {
    localStorage.removeItem('auth');
    return Promise.resolve();
  },
  getIdentity: () => {
    try {
      const { id, fullName, avatar } = JSON.parse(localStorage.getItem('auth'));
      return Promise.resolve({ id, fullName, avatar });
    } catch (error) {
      return Promise.reject(error);
    }
  },
  getPermissions: (params) => Promise.resolve(),
};

Alright! Now let's make use of the authProvider and httpClient in our app. Import authProvider and httpClient from './Auth' into App.js and pass httpClient as a second parameter to lb4Provider. Then add an authProvider prop to the Admin component and pass in authProvider as its value.

Simple and easy!

// ----------- Import Auth -------------
import { authProvider, httpClient } from './Auth';

//…

// ------------ Use httpClient and authProvider in the Admin component ---------
<Admin
  dataProvider={lb4Provider(process.env.REACT_APP_API_URL, httpClient)}
  authProvider={authProvider}
>
  {/* ...resources as before */}
</Admin>

Save the files, head back to the browser, and you'll be greeted with a login screen. Fill in the email and password of your registered user and you'll be taken to the customers' table like before.

And that’s it! We now have a super-secured app! 💪

Deploy 🚀

We now have a fully functional admin dashboard with authentication. Lastly, I'd like to walk you through deployment to your favourite cloud provider.

Since the API generated using Loopback is a standard Node.js server, you can deploy your app to any Node.js hosting provider, e.g., Heroku or Glitch. But note that you will need to move all packages under devDependencies to the dependencies section in your package.json file.
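As an illustration, the relevant part of package.json might end up looking like the sketch below after the move; the exact package list and versions will differ for your project:

{
  "scripts": {
    "build": "lb-tsc",
    "start": "node -r source-map-support/register ."
  },
  "dependencies": {
    "@loopback/core": "^2.16.0",
    "@loopback/rest": "^9.3.0",
    "@loopback/build": "^6.4.0",
    "typescript": "~4.2.4"
  }
}

The point, as the paragraph above says, is that everything needed to build and start the server must be available in a production install.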

And for the React-admin dashboard, you can deploy it on any static hosting service, e.g., Netlify or Vercel. Don't forget to replace the lb4Provider URL with that of your hosted backend.

How To Display Data Visually From a Database in under 5 Minutes
1 September 2021
Vyom Srivastava
#databases #analytics #dashboard #mysql
Resources

If you’re dealing with numbers, graphs and charts happen to be the best way to make sense out of those numbers! Charts and graphs can help you understand your data and help you make decisions based on them. With Appsmith, you can connect your data sources in just a few steps and generate beautiful graphs and charts.

As of now, Appsmith supports the following databases (in addition to using any REST APIs):

  • MySQL
  • MS SQL
  • Postgres
  • Amazon S3
  • Google Sheets
  • Snowflake
  • Redshift
  • Redis
  • Elastic Search
  • Firestore
  • ArangoDB

In this article, we will display data from our MySQL database to our app on Appsmith.

Step 1: Create an app and connect the database

Log in to your Appsmith account or create a new account (if you don't have one). Once you're logged in, click on the "Create New" button:

Screen Shot 2021-08-23 at 10.35.33 AM.png

After that, click on the Build with Drag & Drop widget. You’ll see this:

Screen Shot 2021-08-23 at 10.39.42 AM.png

On the left sidebar, you'll see an option for "Datasources". Click on the "+" button, and it will open a list of all data sources supported by Appsmith:

Screen Shot 2021-08-23 at 11.20.45 AM.png

For this blog, I will use the Mock Database (provided to all users of Appsmith to help with quick prototyping), which is based on MySQL; you can go for your preferred database.

NOTE: Since I am using the Mock Database, it doesn't ask for database credentials. When you select another data source, you'll need to enter the database credentials. Say you want to connect to a new data source; you'll see the page below:

Screen Shot 2021-08-28 at 5.49.10 PM.png

Once you select the data source (for example, MySQL), you now need to enter your credentials:

Screen Shot 2021-08-28 at 5.49.43 PM.png

Step 2: Add the first query and widget

Once you have added the database (in this case, selected the data source), you should see the list of all the tables present in your database.

Here’s how it will look:

Screen Shot 2021-08-23 at 11.21.49 PM.png

This allows you to perform CRUD operations against those tables. Let's try to add our first widget. Now, click on the "+" button in front of Datasources. You should see the list of all connected data sources. Click on "+ New Query" for the data source you want to query. Since we're using the mock database, we'll select that:

Screen Shot 2021-08-28 at 5.56.11 PM.png

Once you click the “+ New Query” button, you now need to click on the “Select” option from the list to run the SELECT queries:

Screen Shot 2021-08-28 at 6.02.01 PM.png

This will open a query editor where you need to paste the query below:

SELECT count(public."users"."name"), date(public."users"."createdAt")
FROM public."users"
GROUP BY date(public."users"."createdAt")
ORDER BY date(public."users"."createdAt") ASC;
Screen Shot 2021-08-28 at 6.05.07 PM.png

Now, to display this data, we'll add it as a chart. Just select the chart widget, available on the right sidebar of the page.

Screen Shot 2021-08-24 at 8.44.21 AM.png
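If the chart doesn't pick up your columns automatically, you can bind the series data explicitly. Appsmith's chart widget expects an array of {x, y} points, so a hedged binding (assuming the query kept its default name, Query1, and exposes the count and date columns from the SQL above) would be:

{{
  Query1.data.map((row) => ({
    x: row.date,
    y: row.count,
  }))
}}

Rename Query1 and the row keys to match whatever your query actually returns.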

Step 3: Add Stats for Orders

Hover the cursor on "public.orders" and click on the “Add” button and select the SELECT command option. Now, paste the below query:

SELECT count(public."orders"."deliveryAddress"), public."orders"."orderDate"
FROM public."orders"
GROUP BY public."orders"."orderDate"
LIMIT 7;

This will return seven days' worth of order counts. We now need to select the chart widget so that the data can be displayed.

Screen Shot 2021-08-24 at 8.52.29 AM.png

Step 4: Add Stats for Revenue

Hover the cursor on “public.orders” and click on the “Add” button and select the SELECT command option. Now, paste the below query:

SELECT sum(public."orders"."orderAmount"), public."orders"."orderDate"
FROM public."orders"
GROUP BY public."orders"."orderDate"
LIMIT 7;

This will return seven days' worth of revenue data. We now need to select the chart widget so that the data can be displayed.

Screen Shot 2021-08-24 at 8.56.52 AM.png

Step 5: Adding employees' data

Hover the cursor on “public.employees,” and click on the “Add” button and select the “SELECT” command option. Now, paste the below query:

SELECT public."employees"."employee_id", public."employees"."employee_name", public."departments"."department_name"
FROM public."employees"
LEFT JOIN public."departments" ON public."departments"."department_id" = public."employees"."department_id";

This will return the employees' data, which includes employee id, employee name, and employee department. We now need to select the chart widget so that the data can be displayed.

Screen Shot 2021-08-24 at 9.00.25 AM.png

Step 6: Add To-Do List for Employees

Hover the cursor on “public.standup” and click on the “Add” button and select the “SELECT” command option. Now, paste the below query:

SELECT public."standup_updates"."date", public."standup_updates"."name", public."standup_updates"."notes"
FROM public."standup_updates"
LIMIT 10;

This will return the employees' standup data, which includes the date, employee name, and employee notes. We now need to select the chart widget so that the data can be displayed.

Screen Shot 2021-08-24 at 9.03.34 AM.png

Step 7: Deploying the dashboard

Once we're done with adding and aligning all the widgets, we need to deploy the dashboard: just click on the "Deploy" button at the top of the page, and it'll get deployed instantly!

screencapture-app-appsmith-applications-61231d24695af65b46d7449b-pages-61231d24695af65b46d7449d-2021-08-24-09_12_41.png

You can also check out the live example here.

Final Words

Displaying data from a database is very easy with Appsmith. Our plug-n-play UI allows you to add any database and display the data in graphs, charts, tables, dropdowns, or even as plain text. You can also add many more features to this dashboard, like creating new orders, creating or updating employee data, or performing other CRUD operations on your database tables using our widgets.

Test and Deploy your APIs Using These Open Source Tools
28 July 2021
Vyom Srivastava
#apis #open-source #testing #app-development
Resources

Testing an API is one of the most important phases of the API development lifecycle. It ensures that the API you're deploying on the server is bug-free and highly optimized. But the testing phase can be very complex, as it involves different types of testing, such as load testing, regression testing, and security testing. There are a lot of challenges developers and testing teams face during API testing. Let's discuss them first:

Why is API Testing a Difficult Task?

We all know that APIs involve a lot of modules, each with its own functionality. Let's take a simple example of an e-commerce application. You're going to have many endpoints such as /login, /logout, /cart, /wishlist, /profile, and so on. You need to ensure that each endpoint delivers what it is supposed to. For example, /cart should only show products associated with a particular profile and shouldn't mix in other profiles' products.

An application like the one mentioned above can have about 300-400 endpoints, or sometimes even more! On top of that, you need to make sure that the validations work, response time is low or at least optimized, there are no bugs in the API, and performance holds up even when thousands of requests are made simultaneously. You also need to ensure that the API returns appropriate status codes such as 20x, 40x, 50x, etc. All this makes API testing not only tricky but also time-consuming.

To reduce the complexity of the whole testing and deployment process of an API, there are a lot of open-source tools available on the internet (if you prefer non-open-source tools, you can check out Postman or Firecamp). These tools not only save a lot of time but also give you insights like the response time of the API, among others.

Once you’re done with testing, you can deploy the API on a server. Deployment is the process where you’re ready to go live with the API and need to move it to the live server. Every time you make a change to the API, you need to redeploy the API on the server (after testing, obviously! 😬)

Here's our list of open source tools that you can use to test and deploy your API:

SoapUI

Screen Shot 2021-07-27 at 10.54.02 PM.png

SoapUI is an API tool that allows you to test and deploy your APIs. It is one of the most mature and trusted API testing tools. One of its unique features is that it supports SOAP APIs too. The tool is mainly used for QA and API testing. It also allows you to connect external data sheets to retrieve data for test executions.

SoapUI also allows you to send multiple API requests by triggering a single test case, and it supports a wide variety of testing, such as load testing, functional testing, and security testing.

Pros

  • Easy to define variables and pass them in parameters.
  • Support for SOAP API

Cons

  • Slower to perform tests on complex APIs
  • Not completely free
  • UI takes a little bit of getting used to

Apache Jmeter

Screen Shot 2021-07-27 at 10.58.03 PM.png

Apache Jmeter is an open-source testing tool that not only tests APIs but scripts too. You can create your own test cases, and it’ll perform different types of testing like module testing, regression testing, etc.

The UI is quite simple and easy to use. You can test the APIs in two ways: either use direct API requests or write code to make the requests to an API endpoint. The tool is entirely written in Java and supports multiple languages such as Python, C, and Java.

It also comes with a marketplace where you can just download the plugins to expand the platform’s functionalities. It supports multiple protocols such as FTP, HTTP, LDAP, SOAP, etc. JMeter also supports graphs and charts, so the results can be visualized easily. To perform UI testing, you can run Selenium test cases as well.

Pros

  • Free to use and completely open source
  • Can be connected to third party platforms like Jenkins
  • Can be scaled easily
  • Marketplace is a unique feature

Cons

  • Very slow
  • UI is dated

Hoppscotch

Screen Shot 2021-07-27 at 10.59.48 PM.png

Hoppscotch, previously known as Postwoman, is another popular open-source API development and testing platform. It has a dark UI and a minimalistic design scheme. It is one of the fastest API testing tools, allowing you to send requests and copy responses in real time.

It comes with a variety of themes, and you can even install it as a PWA (Progressive Web App) on your mobile device. The tool also lets you open a full-duplex communication channel over a single TCP connection; in other words, you can make WebSocket connections. Another big feature of this tool is that you can also test GraphQL queries.

Pros

  • Support for Websocket
  • PWA
  • Easily create documentation

Cons

  • Comes in web variant only
  • It doesn’t support testing like regression, load, etc

Karate

Screen Shot 2021-07-27 at 11.07.24 PM.png

This platform has been developed by Intuit and is used for multiple purposes like API testing, deployment, creating mock test servers, and web browser automation. It is written in Java but doesn't require end users to write anything in Java; it's so easy to use that even non-programmers can write test cases. It supports YAML as well as CSV, so you can easily use them to write data-driven tests. You can also perform cross-browser Web UI testing.

Pros

  • Support for multiple types of testing
  • Includes a lot of functionalities

Cons

  • It doesn’t have a great UI so you might have to write a lot of code

Insomnia

Screen Shot 2021-07-27 at 11.08.14 PM.png

Insomnia is another open-source tool that lets you track the development and deployment of API endpoints very easily. It uses a Swagger-inspired editor, so if you're familiar with Swagger, you'll find this tool easy to use.

It allows you to import API specs from OpenAPI as well as Postman. It also comes with a CLI tool called Inso, which lets you go in-depth with API testing. You can also connect version control software like GitHub, Bitbucket, and SVN.

Pros

  • Support for .env files
  • Support for Gitsync

Cons

  • Process to test the API is a bit lengthy

What’s Next?

Now that you’re equipped with your APIs, you can use Appsmith to create full-fledged applications by connecting your data to our extensive repository of pre-built UI widgets like forms, buttons, lists, maps and so much more. And since Appsmith is a GUI based platform, you can drag and drop these widgets to create your applications. You can also invite your colleagues to collaborate with them and then deploy it to be shared with internal or external users.

Psst! You can connect your data on Appsmith either through APIs or through our native integrations with popular databases like Postgres, MongoDB, and Snowflake, among others, as well as apps like Google Sheets!

610002bbe68fa271933cee6e_ABK8x87Kne_y4oROzmhaoMwpShsXIcXMr_VubVdsoztQaUGrIxFdgKnwTNUm_Pb4vEDGDNjuVk1t0UgKWrOWSKaZ5pF1HIWNdqm4kqNg6_nPuTZTgaXEAJepVbZKRuW3SHAdmi4u.png

Also, by running CURL commands directly on the platform, you can test and deploy your apps easily and quickly.

Are you interested in building something with Appsmith? Take it for a spin. Join our vibrant community on Discord. To get regular updates on what we’re up to, follow us on Twitter!

Five Open Source Database Managers to Connect Your Data
4 August 2021
Vyom Srivastava
#databases #open-source #mysql #mongodb
Resources

The database is one of the most critical parts of an application. Why? All the information you receive is stored in the database; the app pulls up that information in the way you want it. So the first step to building an app would be to connect your data. It’s no wonder that databases dominate the world of apps.

Some proprietary databases can be expensive and tend to offer limited technologies for data storage; however, you can always choose open-source databases for your next project. You can self-host open-source databases and configure them as you like, since the source code is available. Open-source databases are also very flexible and cost-effective. Many applications use more than one technology for data storage. For example, to deal with real-time data (say, data on real-time visitors), MySQL is not a good choice: it is not designed for high concurrency and takes a lot of time to run many queries at the same time. App developers tend to go with a database like MongoDB, which supports a high level of concurrency. However, the data science team for the same application would probably use MySQL for running complex queries. This shows how developers can use more than one database technology to connect to and perform CRUD operations on their data.

Why should you use a database manager?

  • It helps you to manage multiple databases at the same time.
  • It helps you to optimize your queries by providing meaningful insights such as query execution time, size of the data, etc.
  • You can easily update your databases.
  • They're safer to use since credentials are encrypted before being stored.

There are many database managers available out there, but only a few of them support multiple database technologies. Any good database manager should be able to support multiple databases and the following features:

  • Code linting
  • Error highlighting
  • Persistent database connections
  • Ability to store multiple credentials securely
  • Ability to generate raw code if required

Of course, needless to say, the more the merrier! The ones I’ve mentioned above are the minimum standard across leading open-source database managers.

We’ve curated a list of some of the popular database managers for your next project. Dive right in!

OmniDB

Screen Shot 2021-08-04 at 4.44.00 PM.png

OmniDB is an open-source database manager that provides a collaborative environment for developing and managing your database. It comes in two variants: a hosted version and a stand-alone app version.

It’s powered by WebSockets and allows you to run multiple queries on multiple databases in the background efficiently. The application is built keeping simplicity in mind, making it one of the most user-friendly database managers.

It also supports multiple tabs, which allow you to write clean code and make multiple database connections in one session. It also has a smart SQL editor, which comes with linting, auto-complete, beautification, etc.

Pros:

  • Support for multiple database technologies like MariaDB, MySQL, Oracle, PostgreSQL.
  • Dark theme support
  • Support for SSH terminal.
  • Monitoring dashboard: it provides a graphical interface to monitor all your configured units using Python scripting and other configuration files.
  • Auto-complete

Cons:

  • Doesn’t support NoSQL databases

HeidiSQL

Screen Shot 2021-08-04 at 5.02.18 PM.png

HeidiSQL is another open-source database manager that is extremely user-friendly and lets you connect to multiple databases. It is one of the most powerful database managers out there and enables you to create tables, browse logs, and manage users on MySQL databases and other database technologies.

This tool was initially developed to connect to MySQL only. However, it was extended to MS SQL Server, and it now includes PostgreSQL support as well. HeidiSQL's UI is pretty clean and allows you to create multiple connections. Once you install it, a setup screen follows, collecting essential information like IP, port, username, and password.

You can export your data in CSV, Excel, HTML, SQL, LaTeX, Wiki Markup, and PHP array formats. You can also edit multiple tables together using the bulk edit option. On top of that, the process monitor allows you to kill costly processes.

Pros:

  • Supports multiple database technologies.
  • Export data in multiple formats.

Cons:

  • The UI is minimal, and there's definitely room for improvement.
  • Doesn’t support NoSQL databases

RockMongo

Screen Shot 2021-08-02 at 11.19.35 PM.png

RockMongo is an open-source MongoDB GUI tool that is written in PHP and is an entirely web-based application. It looks very similar to phpMyAdmin and comes with a classic 90s UI (Windows 98-style buttons and a non-responsive layout). It supports all the common operations, making it easy to work with collections, stats, etc.

Like the tools above, it lets you connect to and store the credentials of multiple MongoDB servers, but it doesn't support tabs. This means you can only work on a single MongoDB connection at a time.

Another drawback of using this tool is the dependency on the PHP server. To run and execute queries on this tool, you need to install and run a PHP server.

Pros:

  • Very light-weight
  • Easy to use

Cons:

  • Dependency on PHP server.
  • Doesn’t support multiple tabs.

Robo 3T

Screen Shot 2021-08-04 at 5.04.23 PM.png

Robo 3T, formerly known as RoboMongo, is another open-source MongoDB GUI client. The application is available for platforms like Ubuntu, Mac, and Windows. It comes embedded with the default MongoDB shell and allows you to run complex queries much faster.

Robo 3T is one of the most popular projects on GitHub, which means there's an experienced community to help you out. Since it uses the default MongoDB shell, its resource consumption is likely to be relatively low.

Pros:

  • Auto completion
  • MongoDB shell
  • One of the fastest database managers.

Cons:

  • The UI becomes cluttered sometimes.

Navicat

Screen Shot 2021-08-04 at 5.05.12 PM.png

Navicat is another powerful database management and design application that supports multiple drivers and databases. It comes as a standalone application for Mac, Windows, and Linux and allows you to manage databases like MySQL, MariaDB, SQL Server, SQLite, Oracle, and PostgreSQL very easily.

This application comes with a lot of functionality out of the box, like export to Excel, stored procedures, scheduling, and data transfer. Data transfer is one of the more interesting features: it allows you to transfer tables from one database to another even if they're not on the same server.

Pros:

  • Support for multiple databases and drivers.
  • Data transfer functionality.
  • Available on all platforms.

What’s next?

Once you're done with your database development and design, you'll need a platform to work with the data, right? You can use Appsmith to easily connect your databases. Say you're building an API: you can use Appsmith to connect the database and deploy your APIs. Or say you want to fetch data from a database and plot a graph with it: you can very easily use our drag-and-drop widgets to create a graph and deploy it.

Guess what, you can also connect your data on Appsmith either through APIs or through our native integrations with popular databases like Postgres, MongoDB, and Snowflake, among others, as well as apps like Google Sheets!

610a2b25eaeb381bf46cab45_Da_9-DEJByCND9d3I8Nn7edxWsIMG3pMfle3l7q3p-SGYp4ha7_g5WaDabixDg5p7fKc0WOBzyVXL5vaHhg0vvRxUkzOmePdvSNOkWJ7isz9sDTmZyCp_yqBfUxNlk39ebouSEH2.jpeg

Are you interested in building something with Appsmith? Take it for a spin. Join our vibrant community on Discord. To get regular updates on what we’re up to, follow us on Twitter!

How to Run Manual Jobs in Gitlab CI/CD
12 March 2021
Vihar Kurama
#ci-cd #gitlab #ui #databases #automation
Resources

Gitlab is eating the world, or so we thought till we moved to GitHub as an open-source company. That should tell you enough about our love for Gitlab, but there is one thorny problem with Gitlab CI that didn't have a solution in any pricing tier: the ability to manually trigger CI jobs with custom parameter values. This article will explore the benefits and drawbacks of manual jobs, potential workarounds, and finally, how we can get around this problem using the Gitlab API and forms.

Why are manual jobs important in CI/CD?

Although CI is primarily aimed to be continuous (I didn't see that coming, did you?), there is still a precious place for manually triggering jobs in a CI pipeline. This becomes especially apparent when we consider parameters in the build process. For example, say you are working on a separate branch that does a PoC of a new feature. You want to ensure that the CI can handle the codebase in this branch but don't want to push the image to your official organization. If your build script took parameters (with sane defaults, of course) that could be configured to push the image to your personal organization instead, you'd be able to test it to your heart's content. What's more, you could even run the Docker image and check that all is in order. Basically, you don't break the app.

Another simple example would be when you have a specific set of servers you want to push your changes to from a specific branch in your repo. This is a case of CD, with finer control. You can use the same deploy pipeline but set the branch and server(s) variables and run this job to get the desired outcome. This becomes incredibly useful as the organization scales and has multiple branches, teams, and servers, especially in this age of mono-repos.

If you're a pipeline wizard, you can probably spin off a new script for these jobs and run it each time, but a more practical solution would be to use a pipeline template with a few variables (with defaults set for a standard run) that can be populated for manual runs. This is where Gitlab comes up a bit short.

gitlab.png

I have no idea what I am doing here!

How do I run manual jobs using GitLab?

If you're willing to compromise on everything else Gitlab has to offer, Jenkins is your go-to for CI/CD, and boy does it ship with an out-of-the-box manual job runner. But of course, who wants to compromise on all the other awesome stuff from Gitlab?

The biggest inconvenience stems from the fact that Jenkins and Gitlab are two separate systems. It's up to the developer to ensure that these two talk to each other well and stay on good terms. For example, if the source code on Gitlab moves from being a Java project to a Golang project, the Jenkins build jobs must be reconfigured to stop using Maven and start using Go. Having Gitlab be the integrated solution for both your source code and your CI pipelines is just too convenient. The overhead of running both Jenkins and Gitlab will just make your team grumble much more than usual.

Let's say you do end up integrating with Jenkins, are you sure this is enough for you to sacrifice all of Gitlab's CI/CD efficiencies for?

project-build.png

Do you want to integrate this with your repo?

Manual Jobs For Gitlab

Assuming you haven't ripped apart your Gitlab instance and installed Jenkins, and you're still asking yourself, "So how does one trigger manual jobs in Gitlab?", the main (and surprisingly the simplest) suggestion, from the discussion on this issue from four years ago (remember, Google is your friend), is to build a form for each job and use the Gitlab API to trigger it. Why don't we give that a shot? Sounds simple enough, right?

<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Gitlab CI Trigger</title>
</head>
<body>
  <form method="post" action="gitlab-api-endpoint">
    <p>
      <label for="branch">Branch</label>
      <select id="branch" name="branch">
        <option value="master">master</option>
        <option value="feature/one">feature/one</option>
        <option value="feature/two">feature/two</option>
        <option value="bug/that-nasty-bug">bug/that-nasty-bug</option>
      </select>
    </p>
    <p>
      <input type="checkbox" id="debug" name="debug">
      <label for="debug">Print debug output</label>
    </p>
    <p>
      <label for="reason">Reason for trigger</label>
      <input name="reason" id="reason">
    </p>
    <p>
      <button type="submit">Run pipeline!</button>
    </p>
  </form>
</body>
</html>
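For completeness, the gitlab-api-endpoint placeholder in that form would point at GitLab's pipeline trigger API. Here's a hedged sketch of the same request made from a script, assuming a trigger token created under your project's Settings → CI/CD → Pipeline triggers (PROJECT_ID and TRIGGER_TOKEN are placeholders):

// A sketch only; substitute your real project id and trigger token
const params = new URLSearchParams({
  token: 'TRIGGER_TOKEN',
  ref: 'master',
  'variables[debug]': 'true',
});

fetch('https://gitlab.com/api/v4/projects/PROJECT_ID/trigger/pipeline', {
  method: 'POST',
  body: params,
})
  .then((res) => res.json())
  .then((pipeline) => console.log('triggered pipeline', pipeline.id));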

We're not even getting started here 😰

screen.png

Voila? 🤔

But a few things to remember before we deploy that in production: you need to update the branch list whenever it changes, which is almost every day. Also, remember to add or remove parameters along with your build pipeline. Every time you run a manual job, you essentially have to rebuild this file, or scope out all potential manual jobs in advance and keep files ready. This is really inconvenient.

Beyond just being inconvenient, this method doesn't scale. Even if you maintain this HTML form as part of your source code, it still warrants a significant maintenance effort. You don't want to spend time building/debugging/fixing/maintaining your pipeline-runner, you want to build the software you are actually working on.

The better way to manage CI/CD workflows

Fear not, Appsmith is here. While not an official workaround, Appsmith makes it incredibly easy to create your forms and set up the API calls needed to run manual jobs. Building the form is easier than in Jenkins, API calls can be managed gracefully in the interface, AND you can build a job runner that looks like this in 10 minutes.

We've built a small tutorial to walk you through how to create a manual pipeline to deploy your code to multiple production servers. Let's get started!

Appsmith is an open-source, cloud or self-hosted platform for building admin panels, CRUD apps, and workflows. Appsmith helps you speed up application building through fast UI creation, making you write code only when necessary.

Here’s a video to get you started!

We’ll be building a dashboard on Appsmith to simplify and manage a simple CI/CD workflow that allows developers to deploy software by selecting a specific branch and server.

Additionally, we’ll create options to

  • View a Specific CI/CD Workflow
  • Cancel a Running Workflow
  • Retry a Failed Workflow
  • Delete a Workflow
  • Save Workflow History

This dashboard is going to be created entirely using Appsmith and the Gitlab API.

We'll be using a Gitlab repository named ci_cd to follow along and set up workflows. It's a basic "Hello World!" Node.js application that is deployed to multiple servers on Heroku. We have our .gitlab-ci.yml file configured to run two jobs. The first job performs a test with the npm test command, and the second deploys our app onto Heroku using a Ruby gem called dpl.

The entire workflow is configured dynamically so that it can be deployed to any Heroku server given a Heroku <app-name> and an <api-key>. To hide these keys, we'll set them as environment variables for the workflow. Lastly, on Heroku, we have two deployment servers: appsmith-ci-cd-server1 and appsmith-ci-cd-server2.

Below is a snippet showing the content of .gitlab-ci.yml file.

image: node:latest
stages:
  - test
  - deploy

testing:
  stage: test
  script: npm test

deploying:
  stage: deploy
  image: ruby:latest
  script:
    - gem install dpl
    - dpl --provider=heroku --app=$heroku_app_name --api-key=$heroku_api_key

Building Dashboard on Appsmith

Now that we have a clear picture of what’s cooking, let’s head to Appsmith and start building our Dashboard. If you haven’t signed up, you can get started for free, here.

Next, let's head over to our Appsmith dashboard and create a new application by clicking on the Create New button. We'll be redirected to a new application named "Untitled Application 1". Let's give our app a creative name: Deploy Dashboard. Our first step is to build a feature that lists all the existing and future workflow history. To do this, let's use Gitlab's pipelines API, fetch all the details, and display them in an Appsmith table widget.

Let's go ahead and configure our GitLab workflow API.

Setting up Gitlab API as Datasource

In this section, we'll use the GitLab API as a data source and configure it with the Appsmith Deploy Dashboard application. Let's create a new API by navigating to the API section under Page1 and clicking on the Create New option.

Paste the following in the request form and click on Save as Data Source

https://gitlab.com/api/v4/projects/:id

Next, let’s follow the below steps to display all the existing workflows onto the Appsmith dashboard.

  1. Firstly, the id in the above GitLab API is a parameter to access the details of a particular repository on GitLab; you can find it on your repository page on GitLab.
  2. To make things clearer, let's rename the data source to GitLab; we can do this by double-clicking on the existing name.
  3. Next, let's add a new key named PRIVATE-TOKEN under the headers section. We can find the value of PRIVATE-TOKEN on GitLab.
  4. Navigate to Preferences -> Access Tokens on GitLab, create a new access token, and paste it into the PRIVATE-TOKEN key on the Appsmith data source. (Also, be sure to give it at least the api scope.)
  5. Lastly, save the data source!

Getting Workflows List

Our data source is up and running; now let's head back to the API section on Appsmith and create a new API to fetch all the existing workflows in the repository. Create a new API and name it get_pipelines. When you click on the URL field, you'll see a suggestion to use our previously configured GitLab data source. Accept the suggestion and add /pipelines to its end. The URL should now look like the following:

GET https://gitlab.com/api/v4/projects/:id/pipelines

Hit the Run button, and you will be greeted with an array of the workflows linked to that repository! Sweet, isn't it?

Now, to make things cooler, let's build a UI to display all these CI/CD workflows on the Appsmith page. Click on Widgets in the navigation bar, then drag and drop a table widget onto the canvas. You should see a new table with some pre-populated data, along with a floating pane that contains all the table properties. In that pane, edit the Table Data property to the following:

{{get_pipelines.data}}

Now we can see data from the get_pipelines API rendered in the table. You can rearrange the columns and disable those you don't want showing up in the table, e.g., sha and updated_at.
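Alternatively, instead of hiding columns in the UI, you can shape the data in the binding itself. A hedged sketch (the field names follow GitLab's pipelines API response):

{{
  get_pipelines.data.map((p) => ({
    id: p.id,
    status: p.status,
    ref: p.ref,
    created_at: p.created_at,
    web_url: p.web_url,
  }))
}}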

table.png

Triggering A Pipeline

Now let's add a new feature to trigger a new workflow from the same dashboard. To do this, we'll create a new button by dragging and dropping a button widget; rename it to Trigger New Pipeline. Also, drag a modal widget onto the canvas. The button should be configured so that the modal opens whenever it's clicked. The modal's type should be set to form modal, and we'll drag two dropdowns with corresponding labels into it. The first dropdown should be configured to select a branch with the following options:

[
  {
    "label": "master",
    "value": "master"
  },
  {
    "label": "staging",
    "value": "staging"
  }
]

Similarly, we configure the second dropdown to show the server options that are configured on Heroku namely, appsmith-ci-cd-server1 and appsmith-ci-cd-server2:

[
  {
    "label": "appsmith-ci-cd-server1",
    "value": "appsmith-ci-cd-server1"
  },
  {
    "label": "appsmith-ci-cd-server2",
    "value": "appsmith-ci-cd-server2"
  }
]

Perfect, we should now see a great looking modal on our Appsmith dashboard.

form.png

Let's head back to the API section and create a new API that triggers a workflow whenever the Confirm button on our modal is clicked. Let's name this API create_pipeline and set it to the following:

POST https://gitlab.com/api/v4/projects/:id/pipeline

Additionally, we have the option to provide variables and ref (meaning the source branch) in the body of this endpoint. We should configure the body as shown in the snippet below.

{
    "ref": "{{Dropdown1.selectedOptionValue}}",
    "variables": [
       {
            "key": "heroku_app_name",
            "variable_type": "env_var",
            "value": "{{Dropdown2.selectedOptionValue}}"
        },
       {
            "key": "heroku_api_key",
            "variable_type": "env_var",
            "value": "your_heroku_api_key_here"
        }
    ]
}

Looking at the above snippet: the ref key is obtained from the branch dropdown configured previously, and in the variables section, the value of the heroku_app_name key is obtained from our server dropdown.

"You can find the value of heroku_api_key from your Heroku account under the Settings-> API Keys section."

Lastly, let's head back to the modal and configure the onClick action of the Confirm button to trigger the create_pipeline endpoint. Add the following JS snippet to the onClick property under the button settings.

{{
create_pipeline.run(() => get_pipelines.run( closeModal('Modal1')), () => {})
}}

Kudos! With this, we should be able to trigger a workflow from the Appsmith dashboard itself. Let’s now fine-tune this to have more features in the next section.

More Fine Tuning

Alright, in this last section, we'll add fine-grained controls such as deleting, cancelling, retrying, and viewing workflows. These are quite similar to each other, so let's look at the delete option; you can try adding the rest as an exercise :)

Deleting the CI/CD Workflow from Appsmith Dashboard

To implement this feature, let's head back to our table and add a new "Custom Column"; you can find the option under the table settings pane. Let's name this column delete. To make our dashboard more intuitive, we can set the column type to button and add the label delete. Now let's create a delete API under the APIs section and name it delete_pipeline. Use the following endpoint to add the functionality.

DELETE https://gitlab.com/api/v4/projects/:id/pipelines/{{Table1.selectedRow.id}}

This API grabs the id of the selected row in the table, which is set automatically when the delete button is clicked. Heading back to the delete button, let's configure the onClick action to run the delete_pipeline API and, on success, call the get_pipelines API to refresh the table. Here's the configuration JS snippet:

{{delete_pipeline.run(() => get_pipelines.run(), () => {})}}

Perfect, now we have the ability to delete a specific workflow from the Appsmith dashboard itself.

"Disclaimer: Clicking on this button will delete that pipeline for good. As a safety measure, you can add a confirmation modal to prevent accidental delete."

Here’s a quick lookup to configure the other actions to play with our workflows:

Retry Workflow:

POST https://gitlab.com/api/v4/projects/24888719/pipelines/{{Table1.selectedRow.id}}/retry

Cancel Workflow:

POST https://gitlab.com/api/v4/projects/24888719/pipelines/{{Table1.selectedRow.id}}/cancel

Final Thoughts

We are glad you made it this far. As a quick summary, here’s a gif to show everything we’ve built so far!

Honestly, that was a lot to take in, but we've gone through a complete flow of how to build a custom dashboard to manage your CI/CD process. Also, here's a quick demo of the app that we've built! We're quite sure you're bubbling with ideas about the limitless use cases you can build on this. So go into the world and build amazing stuff!

Image Credits: Photo by Ryan Quintal on Unsplash

JUnit Test With Containerized MySQL database (using IntelliJ IDEA)
25 January 2021
Sumit Kumar
#mysql #testing #junit #java #ides
Resources

JUnit is one of the most popular unit testing frameworks used with Java to create repeatable tests. With JUnit each test is written as a separate method inside a Java class. IntelliJ IDEA provides an option to run these test cases from within the IDE.

If you have a module that communicates with a MySQL database, you can unit test the module by providing it access to a MySQL server running inside a testcontainer. You can also configure this MySQL database instance with your desired username, password, and database name using the API provided by the Testcontainers framework.

If you use Maven to manage dependencies in your project, as Appsmith does, you can add the following snippet to your POM file to include the required package:

<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>testcontainers</artifactId>
    <version>1.15.1</version>
    <scope>test</scope>
</dependency>

To create a new MySQL testcontainer instance with JUnit 4, you can follow the approach used in Appsmith's unit test file for its MySQL plugin:

public static MySQLContainer mySQLContainer = new MySQLContainer("mysql:5.7")
            .withUsername("username")
            .withPassword("password")
            .withDatabaseName("test_db");

Please note that the Testcontainers integration differs between JUnit 4 and JUnit 5. Use the one matching the JUnit version you're on. For more details, please see the Testcontainers page.

Databases spawned using Testcontainers, when run from within the IDE, can seem to be inaccessible from outside the IDE. To connect to such databases, you can use the database tool that comes with the IDEA Ultimate edition.

Steps to connect to the MySQL database:

1. Add a debug point in the code at a place where the testcontainer has already been brought up.

Screenshot 2020-12-17 at 10.58.27 AM.png

2. Run the test program in debug mode and wait till it stops at the breakpoint.

Screenshot 2020-12-17 at 10.59.27 AM.png
Screenshot 2020-12-17 at 11.09.16 AM.png

3. Click on the database tool.

Screenshot 2020-12-17 at 11.10.00 AM.png

4. Select your database type.

Screenshot 2020-12-17 at 11.12.06 AM.png

5. Fetch your credentials. You can read the credentials from the testcontainer using the following API when using JUnit 4.

address  = mySQLContainer.getContainerIpAddress();
port     = mySQLContainer.getFirstMappedPort();
username = mySQLContainer.getUsername();
password = mySQLContainer.getPassword();
database = mySQLContainer.getDatabaseName();

6. Test your connection and save credentials.

Screenshot 2020-12-17 at 11.17.17 AM.png

7. Run query.

Screenshot 2020-12-17 at 11.19.44 AM.png

It is noteworthy that Testcontainers provides containerized instances of many other popular databases as well, like Postgres, MongoDB, and Neo4j. Similarly, IntelliJ IDEA's database tool provides connectivity support for most popular databases. The steps described above, to integrate the Testcontainers package or to investigate the containerized database, can be used with databases other than MySQL as well. In summary, the steps to write a unit test using JUnit and any testcontainer can be generalized as follows:

  1. Add dependency for JUnit package.
  2. Add dependency for testcontainers package.
  3. Write code snippet to start a containerized instance of your desired database.

The steps to investigate the containerized database instantiated above can be generalized as follows:

  1. Add a debug point after the container is instantiated but before the test ends.
  2. Start the test and wait till the execution stops at the debug point.
  3. Use the database tool to connect to the containerized database and run queries on it.

If you'd like more code examples showing the above steps in use, do check out the test files in Appsmith's GitHub repository. I hope this article was useful to you; do share your feedback in the comments.

Connect to a Database Instance Running Inside a Java Testcontainer Using IntelliJ IDEA
17 December 2020
Sumit Kumar
#databases #java #ides #testing #container
Resources

The Testcontainers library (Java) is generally used along with JUnit test cases to spawn a test environment within a Docker container. IntelliJ IDEA provides an option to run these test cases from within the IDE.

Databases spawned using Testcontainers, when run from within the IDE, can become inaccessible from outside the IDE. To connect to such databases, you can use the database tool that comes with the IDEA Ultimate edition.

Steps:

1. Add a debug point in the code at a place where the testcontainer has already been brought up.

Screenshot 2020-12-17 at 10.58.27 AM.png

2. Run the test program in debug mode and wait till it stops at the breakpoint.

Screenshot 2020-12-17 at 10.59.27 AM.png
Screenshot 2020-12-17 at 11.09.16 AM.png

3. Click on the database tool.

Screenshot 2020-12-17 at 11.10.00 AM.png

4. Select your database type.

Screenshot 2020-12-17 at 11.12.06 AM.png

5. Enter your credentials. Test your connection and save credentials.

Screenshot 2020-12-17 at 11.17.17 AM.png

6. Run query.

Screenshot 2020-12-17 at 11.19.44 AM.png