12 ways to get more GitHub stars for your open-source project

The steps that we took to grow from 0 to 4,500 stars on GitHub within a few months.

We launched ToolJet (https://github.com/ToolJet/ToolJet) in June 2021, and since then the repository has earned more than 4,500 stars. Here is a list of things that worked for us. This is not an article about simply gaming the star count; instead, it explains how to present your project well so that it is genuinely useful to the open-source community. Several of these points also helped us attract contributors; we now have contributions from more than 100 developers.

PS: The graph above was generated using an app built with ToolJet. You can use it here to generate a star history chart for your project – https://apps.tooljet.com/github-star-history

1) Readme matters

The Readme is the first thing a visitor to your repository sees. It should convey what your project does, how to install it, how to deploy it (if applicable), how to contribute, and how it works. Also, use badges that are helpful to developers. We used https://shields.io/ to add badges to our Readme.
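For example, shields.io badges are plain Markdown image links. The snippet below uses our repository path, but the badge choices are only illustrative; pick the ones that matter to your project:

```markdown
[![GitHub release](https://img.shields.io/github/v/release/ToolJet/ToolJet)](https://github.com/ToolJet/ToolJet/releases)
[![GitHub stars](https://img.shields.io/github/stars/ToolJet/ToolJet)](https://github.com/ToolJet/ToolJet/stargazers)
[![License](https://img.shields.io/github/license/ToolJet/ToolJet)](LICENSE)
```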

Here is how our Readme looks:


Examples of projects with great Readme:
a) https://github.com/nestjs/nest
b) https://github.com/typesense/typesense
c) https://github.com/airbytehq/airbyte
d) https://github.com/strapi/strapi

2) Documentation

We get more traffic to our documentation portal (https://docs.tooljet.com/) than to our main website. A well-documented project is always loved by the community. Open-source projects like Docusaurus make it super easy to build documentation portals that look great out of the box. Adding links back to the repository from the documentation can drive more visitors to your repository.

Here are some projects with great documentation:
a) https://docs.nestjs.com/
b) https://docs.n8n.io/
c) https://guides.rubyonrails.org/
d) https://plotly.com/python/
e) https://docs.mapbox.com/

3) Drive visitors from your website to GitHub


A lot of visitors checked out our repository after visiting our website first. Add banners, badges, etc. to your website so that visitors will check out your repository. To drive more visitors to your website, it helps to write blog posts about relevant topics.

4) Be active in developer communities

There are many Discord/Slack communities, forums, Reddit communities, etc. where developers usually hang out. Be active in these communities without making it look like self-promotion (which can get you banned, for obvious reasons). Try to add value by participating in relevant discussions. For example, if you are building a charting library and someone asks a question about plotting charts in React, you can pitch in to help.

Play nice. Do not try to link to your project if it does not add any value to the discussion.

5) Email campaigns

You might already have users signed up through your website. Add a link to your GitHub repository in the welcome email.

Do not spam people who haven’t signed up for your updates.

6) Trending repositories on GitHub


If you make it to the list of trending GitHub repositories (https://github.com/trending?since=daily), it can get your repository a lot more visibility. Whenever we made it to the trending list, we got more visitors to our repository and website. There are trending lists for specific languages too. Many Twitter bots and other tools notify developers whenever a new repository makes the list.

7) Ask for feedback from relevant communities


Communities such as Product Hunt, Hacker News, and relevant subreddits may find your project useful. This can bring more visitors and stargazers to your repository.

Target only relevant communities. If you think the majority of members won’t find your project interesting, it is not a relevant community. Spamming can cause more harm than good. Also, it’s just not nice.

8) Grow a community

Start a community on Discord or Slack where your users and contributors can hang out. Communities are helpful when members are stuck with something or want to propose something new. If there is an active community, your future posts and announcements may get more reach. We created our community on Slack since most developers already have a Slack account. Avoid lesser-known platforms for building your community, as they add an extra step for anyone joining.

Play nice. Appreciate the work of others.

9) Add a public roadmap

A public roadmap helps your users and contributors understand where your project is headed. There are many tools available for creating public roadmaps, but in most cases GitHub Projects is more than enough for a simple yet effective one. We created ours using GitHub Projects – https://github.com/ToolJet/ToolJet/projects/2

10) Twitter

Being active on posts related to your project can create awareness, grow your Twitter following, and drive more visitors to your repository. Make sure to link your repository from the project’s Twitter profile. Also, add a tweet button to your GitHub repository.

11) Respond to feedback

Open-source communities are usually very helpful and give a lot of feedback. Respond to all of it; the person has taken their valuable time to help you improve your project. Positive feedback helps you stay motivated, while negative feedback helps you rethink.

Do not try to evade negative feedback. Act on it if it aligns with your vision; otherwise, politely explain why not.

12) Add relevant labels for contributors

Adding labels such as “good first issue” and “up for grabs” can attract more contributors to your repository. Platforms such as https://goodfirstissue.dev/ scan for issues tagged with these labels to help contributors discover new repositories and issues to work on. Make sure you respond to contributors quickly. Contributors can be experienced developers as well as students or developers early in their careers; try to help first-time contributors onboard easily.

You landed on this article possibly because you have an interesting open-source project. I’d love to see your project. I’m available at navaneeth@tooljet.com and on Twitter.

Hope this article was helpful. We would really appreciate it if you could take a moment to give us feedback on ToolJet – https://github.com/ToolJet/ToolJet

Source link

Rare & useful Git commands summarized + solution to difficult scenarios while using Git

Git commands

  • git restore . – restores all files to the last commit / undoes all local changes that haven’t been committed.

  • git restore index.html – restores only that particular file to the last commit / undoes all local, uncommitted changes to that file.

  • git reset --hard <hash code of the commit> – removes commits and moves the branch back to the commit with that hash.

  • git restore --source <hash code> index.html – restores only that particular file to its state at the commit with that hash.

  • git commit --amend -m 'Your message' – rewrites the last commit’s message.

  • git revert <hash code> – rolls back a commit by creating a new commit that undoes it. Unlike git reset, it doesn’t remove commits from the log.

  • git reflog – useful to bring back deleted commits/files/changes. Use git reset --hard <hash of the lost commit from reflog> to restore them.

  • git reset HEAD~2 – rolls back by 2 commits and unstages all the changes from those 2 removed commits (the changes stay in your working directory).

  • git reset HEAD~2 --hard – same as above, but also discards the changes from those 2 commits entirely.

  • git rebase (most useful command) – reapplies commits on top of another base tip. e.g. git rebase master replays your current branch’s commits on top of the tip of the master branch.
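To see git reflog rescuing a commit that git reset --hard threw away, here is a self-contained sketch you can run in a throwaway directory (file names and commit messages are made up for the demo):

```shell
# Set up a throwaway repo with two commits.
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you
echo one > f.txt && git add f.txt && git commit -qm "first"
echo two >> f.txt && git commit -qam "second"

git reset -q --hard HEAD~1                   # "second" vanishes from git log...
lost=$(git reflog | awk 'NR==2 {print $1}')  # ...but the reflog still records it
git reset -q --hard "$lost"                  # point the branch back at the lost commit
git log --oneline                            # both commits are back
```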

Moving committed changes to a new branch (scenario: you accidentally worked on master):

  • Use git checkout -b new-feature (the new branch keeps your commits)
  • Then switch back to master and roll back the stray commit using git reset HEAD~1 --hard (this command rolls back 1 commit)
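A runnable sketch of that scenario in a throwaway repo (file names are hypothetical; note that after creating the new branch you switch back to master before resetting):

```shell
# Set up a throwaway repo where a commit accidentally lands on master.
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you
echo base > app.txt && git add app.txt && git commit -qm "base"
git branch -M master                 # normalize the branch name for the demo
echo feature >> app.txt && git commit -qam "oops: feature work on master"

git checkout -qb new-feature         # new branch keeps the stray commit
git checkout -q master
git reset -q HEAD~1 --hard           # master drops it; new-feature still has it

git rev-list --count master          # prints 1
git rev-list --count new-feature     # prints 2
```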


  • Use git stash when you want to record the current state of the working directory and the index, but want to go back to a clean working directory. The command saves your local modifications away and reverts the working directory to match the HEAD commit.
  • The modifications stashed away by this command can be listed with git stash list, inspected with git stash show, and restored (potentially on top of a different commit) with git stash apply. Calling git stash without any arguments is equivalent to git stash push.

  • git stash

    • stashes/saves your changes off to the side and moves the working directory back to the state of the last commit
    • think of it as saving the changes as a draft and returning to the code of the last commit
  • git stash push -m "Message" – stashes with a message that appears in the stash list

  • git stash list – lists all the stashed drafts

    Tip- The stash list stores all the stashes and each stashed feature/code has a unique index number to it. The last added stash always appears at the top with index 0.

  • git stash apply – applies the last stashed draft to our current working directory

  • git stash apply <index number> – applies the particular indexed stash to our current working directory

  • git stash drop <index number> – drops the stash out of the stash list with the particular index

  • git stash pop – applies the most recent stash back to the working directory/working branch and removes it from the stash list

  • git stash pop <index number> – applies the stash with that index back to the working directory/working branch and removes it from the stash list

  • git stash clear – deletes all stashed changes

Moving committed changes to an already existing branch using cherry-pick:

  • git checkout feature-branch
  • git cherry-pick <hash code of that commit on master>
  • git checkout master
  • git reset HEAD~1 --hard (rolls back 1 commit)
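And the same cherry-pick scenario end to end as a runnable sketch in a throwaway repo (file names and messages are hypothetical):

```shell
# Set up: feature-branch exists, but a fix is accidentally committed on master.
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you
echo base > app.txt && git add app.txt && git commit -qm "base"
git branch -M master
git branch feature-branch            # branch off before the stray commit
echo fix >> app.txt && git commit -qam "oops: belongs on feature-branch"
stray=$(git rev-parse HEAD)          # hash of the stray commit on master

git checkout -q feature-branch
git cherry-pick "$stray"             # copy the commit onto feature-branch
git checkout -q master
git reset -q HEAD~1 --hard           # remove it from master
```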

Squashing commits:

  • git rebase -i <hash code of the commit above which all the commits need to be squashed>
    • -i stands for interactive
    • opens the list of commits in your editor (vim by default), where you can mark each commit as pick or squash and update commit messages
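The editor then shows a todo list like the following (hashes and messages here are hypothetical); every commit marked squash is melded into the commit above it:

```
pick a1b2c3d add login form
squash e4f5a6b fix typo in login form
squash 9c8d7e6 adjust login styles
```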


Leading developer relations at a Silicon Valley Startup

Moving to San Francisco and working for a startup in Silicon Valley had been a dream of mine for a while. After all, it is the startup tech hub of the world. As a fresh college graduate last year, I got a chance to make it a reality.

I want to tell the story of leading growth and developer relations at an early-stage devtool startup called Fig. It all started with a Twitter DM that led to an interview, which, a few weeks later, led to me moving across the country to San Francisco.

The work

To set the scene, we were a small, scrappy team of 6 that had recently raised a seed round of a few million dollars. We just needed to execute.

The fast pace was no joke. In the first week, we spent about 12 hours a day at the office for onboarding; that later decreased to around 10.5 hours a day. The founders spent even more time working. We prioritized tasks on a week-to-week basis, which let me work on a wide variety of things. In terms of the 3 pillars of developer advocacy, I was lucky to do work in every pillar.

What I did

What I did while I was at Fig can be broken down into five primary areas:

1. Discord Community

I helped grow the Fig Discord community by DMing new users and answering people’s questions. At one point, I sent a personalized DM to every single user that joined, until it became unscalable.

I also livestreamed myself contributing to Fig’s open source repo weekly on the Discord to encourage our users to contribute and did a livestream with Nader + Fig’s CEO. I ended up helping them scale their Discord community from 1k members to over 2k over the two months I was there.

2. Twitter Account

I came up with our Twitter strategy and executed it, posting a variety of tweets, on average 5 times a week. Fig was a very visual product, so I also recorded a lot of short videos and GIFs to show it off.

We also ran promotions and giveaways on Twitter. Fig was invite-only at the time so we partnered with popular developer influencers to give away hundreds of Fig invites and increase our userbase. I ended up growing their Twitter from 2k to over 4.5k followers.

3. Open Source Contributions

Another thing I did was manage our open source repo. Along with a part-time team member, I reviewed dozens of PRs with some back and forth with our contributors to make sure we were pushing quality code and following best practices.

I also submitted several PRs myself – a total of 63 commits and 19k lines of code. Some of this code was generated using CLI parsers and scripts that I wrote. You can look through my commits here.

4. Writing code

Yet another thing I helped with was the frontend for our Fig settings app. I redesigned and reimplemented it, fixed some bugs, and added features to make it easier for users to customize their settings.

I also helped create parsers for popular CLI tools like curl and GCC, to programmatically grab all the different options and arguments of a CLI tool and generate a completion spec so Fig could autocomplete them.

5. Developer Experience

The final thing I did was help improve the overall developer experience of the product. I collected feedback from users each week through Discord chats, Twitter DMs, Zoom calls, and my livestreams, and relayed it to our engineering team.

I also revamped our entire documentation to improve the UI, base it on the Divio system, and add a few extra guides. Naturally, our docs used Next.js and were hosted on Vercel 🙂

Lessons Learned

Overall, my work contributed to thousands more members in our Discord and Twitter communities, which led to more OSS contributors, more GitHub stars, and significantly more users. I’m proud of what I did at Fig.

It was a hectic and rewarding couple of months, and even though it didn’t work out in the end, I’m extremely thankful to Fig for giving me my start in the world of SF startups and for everything I learned.

They taught me how to ruthlessly prioritize and focus on the biggest pain points first. They taught me that a plan means nothing without solid execution. And they taught me that unexpected events occur and you need to be ready to deal with them.


Using BoltDB as internal database 💾

If you are looking for a small database for your fun project, I have something you might like. It is small, works as a key-value store, and it’s pure Go.

What is Bolt?

Bolt is a pure Go key/value store inspired by Howard Chu’s LMDB project. The goal of the project is to provide a simple, fast, and reliable database for projects that don’t require a full database server such as Postgres or MySQL.

Since Bolt is meant to be used as such a low-level piece of functionality, simplicity is key. The API will be small and only focus on getting values and setting values. That’s it.

How to use Bolt?


Bolt doesn’t need any installation other than go get; it works as a plain library, so all you need to do is add it to your go.mod file.

go get github.com/boltdb/bolt

Creating database

After that, you can create your database like this.

    db, err := bolt.Open("my.db", 0600, &bolt.Options{Timeout: 1 * time.Second})
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

Here you can add a timeout to your database, or set it to read-only if you don’t need to write into it. Just set the corresponding fields in bolt.Options.

Writing into database

You need to create a bucket first. After that, you can write your key/value data into the bucket. The tx.CreateBucketIfNotExists([]byte) function is a lifesaver.

    db.Update(func(tx *bolt.Tx) error {
        bucket, err := tx.CreateBucketIfNotExists([]byte("todo"))
        if err != nil {
            return fmt.Errorf("create bucket: %s", err)
        }
        return bucket.Put([]byte("task-1"), []byte("Test BoltDB"))
    })

If you already have a bucket, you can write into it like this. Caution: checking whether the bucket is nil is important, because calling Put on a nil bucket will make your program panic.

    db.Update(func(tx *bolt.Tx) error {
        bucket := tx.Bucket([]byte("todo"))
        if bucket == nil {
            return fmt.Errorf("get bucket: FAILED")
        }
        return bucket.Put([]byte("task-1"), []byte("Test BoltDB additions"))
    })

Querying data from database

Querying is as simple as writing. Just open the bucket and ask it for your data with Get.

    db.View(func(tx *bolt.Tx) error {
        b := tx.Bucket([]byte("todo"))
        if b == nil {
            return fmt.Errorf("get bucket: FAILED")
        }
        value := b.Get([]byte("task-1")) // nil if the key doesn't exist
        fmt.Println(string(value))
        // should return nil to complete the transaction
        return nil
    })

And here is a simple way to iterate over all keys in a bucket.

    db.View(func(tx *bolt.Tx) error {
        b := tx.Bucket([]byte("todo"))
        if b == nil {
            return fmt.Errorf("get bucket: FAILED")
        }
        // we need a cursor for iteration
        c := b.Cursor()
        for k, v := c.First(); k != nil; k, v = c.Next() {
            fmt.Println("Key: ", string(k), " Value: ", string(v))
        }
        // should return nil to complete the transaction
        return nil
    })


  • Bolt is a pure Go key/value store
  • Bolt is a native Go library
  • Bolt is supported everywhere Go is supported
  • Creating, writing, and querying with Bolt is really easy.

Simple and powerful toolkit for BoltDB

Storm is a simple and powerful toolkit for BoltDB. Basically, Storm provides indexes, a wide range of methods to store and fetch data, an advanced query system, and much more.
GitHub: https://github.com/asdine/storm



Getting Telescope's React Native App

Hello again! It’s me, Luigi, and today I am going to talk about the vision of Telescope getting a React Native app. This is a very big subject, so I am going to cover as much as I can in this blog post. Things I am going to discuss:

  • What is React Native?
  • React vs React Native?
  • What CLI we should use for our development?
  • Possible Telescope Goals
  • How can you get started to help?

What is React Native?


React Native is a framework that lets programmers write JavaScript to build mobile applications. Some of the platforms React Native can currently build for are iOS, Android, and Windows (still new and buggy).

React vs React Native

React is a JavaScript library used for building front-end web applications. React Native uses the React library to build the front end of a mobile application.

What React Native CLI Should we use for our Development?


Expo CLI vs React Native CLI

When starting a React Native project, you have the choice between two CLIs (command-line interfaces): React Native’s or Expo’s. Both are awesome tools to build with, but each has its pros and cons.

Expo CLI


Pros:

  • Fast development setup
  • Quickest way to start development with React Native
  • Live reload in development
  • Library linking
  • Can test the app on Apple/Android devices (Expo builds the apps)
  • Can eject/convert back into a React Native CLI project

Cons:

  • Builds are done by Expo
  • Native modules are not supported
  • Expo apps are bigger because all libraries are included

React Native CLI


  • Builds are done locally
  • Native modules are supported
  • You have more control over app size


  • Slow development setup
  • No live reload in Development
  • No library linking (things need to be linked with react-native link)
  • Can’t turn into an expo app

Why I Recommend Expo as our Starting Point?

As someone who has worked with both, I think we should start development with Expo because it is easier and faster to set up, easier to develop in, and can always be converted back into a React Native CLI project. This will encourage open-source developers to learn and contribute because of how much easier it is to get involved with the project. Although the React Native CLI provides native module support and control over app size, I do not believe those things are worth the frustration of working with it. If you know of more pros or cons, please leave a comment below.



Starting Point

I think the first goal we should have for the app’s functionality is to display blog posts with basic user information correctly. Although it sounds easy, we are designing, developing, and testing a new app, which requires a lot of configuration, learning, and fixing. A more detailed goal list:

  • Splash Screen
  • App Icon
  • Published on app store
  • Get time line of blogs to display with user information correctly
  • Basic navigation setup

Once we hit that goal, we can add user authentication and then slowly layer more features onto the app, creating better workflows with the power of Expo! If you have any ideas for goals, please comment below!

How can you get started to help?


Well, now you’re wondering, “How can I get involved?” To start, you should review the documentation and practice using Expo so that when we start development you’re ready! The Expo documentation is an amazing source for learning how to get started with React Native! It teaches you everything from installing React Native to learning workflows. Once you feel a little comfortable, you should build a small prototype! When I learn a new framework, I try to create a small project to help me understand how to use the tool practically. Some ideas are a TODO app, a book app that displays book information, a news app, or really anything that inspires you!


First PR of the Year :D

The feature I am interested in continuing to work on was born👶🏻 from a bigger set of features meant to integrate other types of communication and media into Telescope. I think it would be a great addition and would make Telescope a more “complete” aggregator:D Currently, Telescope aggregates blog posts from several blog feeds.

The bigger set of features included things like incorporating videos to the timeline and showcasing live streams with an ‘exclusive’ chat. You can learn more about it in this GitHub issue #1026.

The feature I decided to implement was incorporating videos into the timeline. Aggregating videos is exactly the same as aggregating blog posts, since platforms like YouTube provide an RSS feed that you can pull into your own content aggregator (although this seems to be something of a hidden feature, because I only found out about it thanks to the help of my professor💡).
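For reference, YouTube’s per-channel RSS feed follows a URL pattern like the one below (the channel ID placeholder is illustrative; verify against YouTube’s current behavior before relying on it):

```
https://www.youtube.com/feeds/videos.xml?channel_id=<CHANNEL_ID>
```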

I broke up the feature implementation in two PRs, one that would address the front-end (#2596), and another that would address the back-end (#2581).

These PRs were created some time ago, and I decided to finish them after the holidays👩🏻‍💻🐱‍💻. This week, I focused mainly on the back-end PR, as it introduces the changes that actually aggregate the videos. If the back-end PR is merged, it will make the front-end PR easier to run (right now you have to do some manual setup to demo the front-end PR).

For the back-end, I applied some changes that one of the reviewers requested, as well as adding a few tests to verify that it works as intended.

As of the time of this writing, the PR is pending on review. Hopefully, it will be accepted after the review is done 😀


Optimize Your Webserver by Installing a Single NGINX Module

In 2012, Google released version 1.0 of their PageSpeed modules for NGINX and Apache. It has gone largely unnoticed since then. The short of PageSpeed is that if you add it to your web server, you can configure it to optimize anything passing through it using techniques such as minification, format conversion, and even injecting scripts to lazy-load images. You can read more about what it does on the official site.

It sounded great in theory, but how to properly install it with NGINX wasn’t obvious. While Google does publish scripts to help with the installation, doing it right requires a non-trivial depth of knowledge. After struggling with it for many hours, I wrote a guide for my own future reference.

I recently returned to those notes to entirely automate the process using GitHub Actions. The work is open-source and available on GitHub.


Run the following as root on a Debian-based machine:

sudo su
apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 8028BE1819F3E4A0
echo "deb https://nginx-pagespeed.knyz.org/dist/ /" > /etc/apt/sources.list.d/nginx-pagespeed.list
echo "Package: *" > /etc/apt/preferences.d/99nginx-pagespeed
echo "Pin: origin http://nginx-pagespeed.knyz.org/" >> /etc/apt/preferences.d/99nginx-pagespeed
echo "Pin-Priority: 900" >> /etc/apt/preferences.d/99nginx-pagespeed
apt update
apt install nginx-full # If NGINX is already installed, an `apt upgrade` works too
echo "pagespeed on;" > /etc/nginx/conf.d/pagespeed.conf
echo "pagespeed FileCachePath "/var/cache/pagespeed/";" >> /etc/nginx/conf.d/pagespeed.conf
echo "pagespeed FileCacheSizeKb 102400;" >> /etc/nginx/conf.d/pagespeed.conf
echo "pagespeed FileCacheCleanIntervalMs 3600000;" >> /etc/nginx/conf.d/pagespeed.conf
echo "pagespeed FileCacheInodeLimit 500000;" >> /etc/nginx/conf.d/pagespeed.conf
echo "pagespeed RewriteLevel CoreFilters;" >> /etc/nginx/conf.d/pagespeed.conf
systemctl reload nginx

The installation process is explained more thoroughly on the GitHub page if you’re curious.

Once that is done, you will have an active NGINX + PageSpeed installation that will receive the same updates as upstream NGINX. You can learn more about individual filters that you can enable in the documentation.
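For instance, filters beyond the CoreFilters level are enabled with the EnableFilters directive in pagespeed.conf. The filter names below are real PageSpeed filters, but treat the selection as an example rather than a recommendation:

```nginx
pagespeed EnableFilters lazyload_images,collapse_whitespace;
```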

This post was originally shared on my Building Better Software Slower blog


Launching: Open-source one-page checkout

Introducing Medusa Express⚡️

What if you could buy and sell products via a link that sends you directly to check-out?

I wanted to share a cool new open-source project that we have been building for our headless commerce engine. Medusa Express is a one-page checkout flow that lets you purchase a product directly via a URL, which can be handy for e.g. retargeting campaigns, outbound sales, special releases, or bloggers who want to sell easily to their followers.

Medusa Express automatically creates a page for each product in your ecommerce backend, each one optimized to make the purchasing experience as frictionless as possible by bundling the checkout flow alongside the product.

Access it in the top link!

How it works:

  1. A customer visits a link e.g. https://medusa.express/basic-tshirt

  2. The customer selects the variant of the product they wish to buy

  3. The customer completes the checkout flow and the order is completed

Hope you find it relevant!


How to set up Vue with Tailwind CSS and Flowbite

Vue.js is a popular front-end library used by websites such as Behance, Nintendo, GitLab, Font Awesome, and more, which you can use to build modern web applications.

By installing Tailwind CSS and Flowbite you can build your project even faster using the utility-first approach from Tailwind and the interactive components from Flowbite.

Install Tailwind CSS with Vue.js

Follow these steps to install Tailwind CSS and Flowbite with Vue 3 and Vite.

  1. Create a new Vite project running the following commands in your terminal:
npm init vite my-project
cd my-project
  2. Install Tailwind CSS:
npm install -D tailwindcss postcss autoprefixer
npx tailwindcss init -p
  3. Configure the template paths inside the tailwind.config.js file (the content globs below follow Tailwind’s standard Vite + Vue setup):
module.exports = {
  content: [
    "./index.html",
    "./src/**/*.{vue,js,ts,jsx,tsx}",
  ],
  theme: {
    extend: {},
  },
  plugins: [],
}

  4. Create a new ./src/index.css file and add the Tailwind directives:
@tailwind base;
@tailwind components;
@tailwind utilities;
  5. Import the newly created CSS file inside your ./src/main.js file:
import { createApp } from 'vue'
import App from './App.vue'

// add this
import './index.css'

  6. Install Flowbite by running the following command in your terminal:
npm install @themesberg/flowbite
  7. Require Flowbite as a plugin inside your tailwind.config.js file (the plugin path below follows the @themesberg/flowbite v1 docs; verify it against the version you install):
module.exports = {
  // ...
  plugins: [
    require('@themesberg/flowbite/plugin')
  ],
}

  8. Import the Flowbite JavaScript file inside your main ./src/main.js file:
import '@themesberg/flowbite';

Now you can start the local server by running npm run dev in your terminal.

Flowbite components in Vue.js

You can start using all of the components from Flowbite in your Vue.js project as long as you’ve properly followed the instructions above and installed both Tailwind CSS and Flowbite.

The interactive elements such as the dropdowns, modals, and tooltips will work based on the settings that you apply using the data attributes.

Here’s an example of a modal component that you can use by adding it inside your App.vue template file:

  <img alt="Vue logo" src="./assets/logo.png" />
  <HelloWorld msg="Hello Vue 3 + Vite" />

  <!-- Modal toggle -->
  <button class="block text-white bg-blue-700 hover:bg-blue-800 focus:ring-4 focus:ring-blue-300 font-medium rounded-lg text-sm px-5 py-2.5 text-center dark:bg-blue-600 dark:hover:bg-blue-700 dark:focus:ring-blue-800" type="button" data-modal-toggle="default-modal">
    Toggle modal
  </button>

  <!-- Main modal -->
  <div id="default-modal" aria-hidden="true" class="hidden overflow-y-auto overflow-x-hidden fixed right-0 left-0 top-4 z-50 justify-center items-center h-modal md:h-full md:inset-0">
      <div class="relative px-4 w-full max-w-2xl h-full md:h-auto">
          <!-- Modal content -->
          <div class="relative bg-white rounded-lg shadow dark:bg-gray-700">
              <!-- Modal header -->
              <div class="flex justify-between items-start p-5 rounded-t border-b dark:border-gray-600">
                  <h3 class="text-xl font-semibold text-gray-900 lg:text-2xl dark:text-white">
                      Terms of Service
                  </h3>
                  <button type="button" class="text-gray-400 bg-transparent hover:bg-gray-200 hover:text-gray-900 rounded-lg text-sm p-1.5 ml-auto inline-flex items-center dark:hover:bg-gray-600 dark:hover:text-white" data-modal-toggle="default-modal">
                      <svg class="w-5 h-5" fill="currentColor" viewBox="0 0 20 20" xmlns="http://www.w3.org/2000/svg"><path fill-rule="evenodd" d="M4.293 4.293a1 1 0 011.414 0L10 8.586l4.293-4.293a1 1 0 111.414 1.414L11.414 10l4.293 4.293a1 1 0 01-1.414 1.414L10 11.414l-4.293 4.293a1 1 0 01-1.414-1.414L8.586 10 4.293 5.707a1 1 0 010-1.414z" clip-rule="evenodd"></path></svg>
                  </button>
              </div>
              <!-- Modal body -->
              <div class="p-6 space-y-6">
                  <p class="text-base leading-relaxed text-gray-500 dark:text-gray-400">
                      With less than a month to go before the European Union enacts new consumer privacy laws for its citizens, companies around the world are updating their terms of service agreements to comply.
                  </p>
                  <p class="text-base leading-relaxed text-gray-500 dark:text-gray-400">
                      The European Union’s General Data Protection Regulation (G.D.P.R.) goes into effect on May 25 and is meant to ensure a common set of data rights in the European Union. It requires organizations to notify users as soon as possible of high-risk data breaches that could personally affect them.
                  </p>
              </div>
              <!-- Modal footer -->
              <div class="flex items-center p-6 space-x-2 rounded-b border-t border-gray-200 dark:border-gray-600">
                  <button data-modal-toggle="default-modal" type="button" class="text-white bg-blue-700 hover:bg-blue-800 focus:ring-4 focus:ring-blue-300 font-medium rounded-lg text-sm px-5 py-2.5 text-center dark:bg-blue-600 dark:hover:bg-blue-700 dark:focus:ring-blue-800">I accept</button>
                  <button data-modal-toggle="default-modal" type="button" class="text-gray-500 bg-white hover:bg-gray-100 focus:ring-4 focus:ring-gray-300 rounded-lg border border-gray-200 text-sm font-medium px-5 py-2.5 hover:text-gray-900 focus:z-10 dark:bg-gray-700 dark:text-gray-300 dark:border-gray-500 dark:hover:text-white dark:hover:bg-gray-600">Decline</button>
              </div>
          </div>
      </div>
  </div>