Adding a Search to my Website
How I added a search to my website using MeiliSearch and Netlify Functions
November 22, 2020
sam.bunting.dev
MeiliSearch
Netlify Functions
Gatsby
I updated my website a few weeks ago - and one of the most noticeable changes is the search feature in the header.
I'd like to talk about how I added the search and how I managed to do it with my existing Gatsby website which is running as a Jamstack site on Netlify.
The motivation for me to do this was because I came across MeiliSearch - I've already spoken briefly about MeiliSearch and how you can use it - and you can read the blog post here.
A quick Summary/Overview
Right. Basically... Whenever my Gatsby site is built, it creates a JSON file containing all my blog posts. Then, whenever the site successfully deploys, a Netlify Function uses that file to update a MeiliSearch server hosted on DigitalOcean.
If you'd like a bit more of a detailed explanation, keep reading...
MeiliSearch Hosting
Because MeiliSearch runs as its own server - and my website is a static site - the search engine needed to be hosted somewhere else.
The search is hosted on a DigitalOcean droplet which was really easy to setup thanks to MeiliSearch being available on the DigitalOcean Marketplace.
Once I had it up and running - I created an index of "posts".
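Creating the index boils down to a single HTTP request to the MeiliSearch server - a sketch of what that looks like (the primaryKey field is an optional extra here; MeiliSearch can infer it from the documents):

POST /indexes
{
  "uid": "posts",
  "primaryKey": "id"
}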
Adding Documents
Documents (which, in this case, are blog posts) need to be uploaded to the MeiliSearch server as JSON. So, in order to do that, I'd need to create a JSON file with the contents of my blog posts.
The way that my site works is that blog posts are written in Markdown, and then used by Gatsby and GraphQL to create the required pages for each post - there isn't really much of an existing database other than the GitHub repo...
So, to solve this issue, I modified my website's gatsby-node.js file so that, while it was creating a page for each post, it would also add parts of each post to an array. This array was then exported as a JSON file, all while the site is being built.
const fs = require('fs')
const path = require('path')

exports.createPages = async ({ graphql, actions }) => {
  const { createPage } = actions
  const blogPost = path.resolve(`./src/templates/blog-post.js`)

  // GraphQL request for every post's slug, frontmatter and raw body
  const result = await graphql(
    `
      {
        allMarkdownRemark(
          sort: { fields: [frontmatter___date], order: DESC }
          limit: 1000
        ) {
          edges {
            node {
              id
              fields {
                slug
              }
              frontmatter {
                title
                date
                description
              }
              rawMarkdownBody
            }
          }
        }
      }
    `
  )

  if (result.errors) {
    throw result.errors
  }

  const posts = result.data.allMarkdownRemark.edges
  const searchData = []

  posts.forEach((post, index) => {
    // Create blog posts pages.
    createPage({
      path: post.node.fields.slug,
      component: blogPost,
      context: {
        slug: post.node.fields.slug,
      },
    })

    // Add search data
    searchData.push({
      id: post.node.id,
      slug: post.node.fields.slug,
      title: post.node.frontmatter.title,
      date: post.node.frontmatter.date,
      description: post.node.frontmatter.description,
      body: post.node.rawMarkdownBody,
    })
  })

  // Save JSON file export.
  fs.writeFileSync('./netlify_functions/deploy-succeeded/search-data.json', JSON.stringify(searchData, null, 2));
}
The searchData array contains everything MeiliSearch needs to serve search requests, while createPage is the Gatsby function which creates a detail page for each blog post. The fs.writeFileSync call then writes the stringified contents of searchData out as a JSON file.
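To illustrate the shape of the exported data, here's a minimal, self-contained sketch using a mock GraphQL result (the mock post is made up, but the field mapping matches the snippet above):

```javascript
// Mock of the shape returned by the allMarkdownRemark GraphQL query.
const result = {
  data: {
    allMarkdownRemark: {
      edges: [
        {
          node: {
            id: 'abc-123',
            fields: { slug: '/adding-a-search-to-my-website/' },
            frontmatter: {
              title: 'Adding a Search to my Website',
              date: 'November 22, 2020',
              description: 'How I added a search to my website',
            },
            rawMarkdownBody: 'I updated my website a few weeks ago...',
          },
        },
      ],
    },
  },
}

// Flatten each edge into the document shape MeiliSearch will index.
const searchData = result.data.allMarkdownRemark.edges.map(({ node }) => ({
  id: node.id,
  slug: node.fields.slug,
  title: node.frontmatter.title,
  date: node.frontmatter.date,
  description: node.frontmatter.description,
  body: node.rawMarkdownBody,
}))

console.log(JSON.stringify(searchData, null, 2))
```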
Updating MeiliSearch
MeiliSearch would need to be updated every time there is a new blog post - otherwise it wouldn't be included within the search. The way to do this is to make a POST request to the MeiliSearch server (the DigitalOcean droplet) and pass through the updated search-data.json file.
However, I'm forgetful. And will forget to do things like that. So I needed a more automated approach...
Using Netlify Functions
Thankfully - Netlify has a way of running serverless functions using Netlify Functions - so I can take the search-data.json file and upload it that way.
If you're thinking that this is like AWS Lambda - you'd be right. Because it technically is.
I wrote a JavaScript function which makes an axios request to POST the search-data.json file.
const fs = require('fs');
const path = require("path");
const axios = require('axios');
require('dotenv').config()

exports.handler = async function(event, context) {
  const data = fs.readFileSync(path.join(__dirname, 'search-data.json'));
  let returnValue = {};

  try {
    const request = await axios({
      method: 'post',
      url: `${process.env.GATSBY_MEILISEARCH_HOST}/indexes/posts/documents`,
      headers: {
        'X-Meili-Api-Key': process.env.GATSBY_MEILISEARCH_APIKEY,
        'Content-Type': 'application/json'
      },
      data: data
    })
    returnValue = { statusCode: 200, body: JSON.stringify(request.data) }
  } catch (error) {
    returnValue = { statusCode: 500, body: JSON.stringify(error) }
  }

  return returnValue;
}
I also added the MeiliSearch host URL and the API key as environment variables, as I don't really like keeping them in source control.
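For illustration, the .env file looks something like this (placeholder values, not my real ones):

GATSBY_MEILISEARCH_APIKEY=definitely-not-my-real-api-key
GATSBY_MEILISEARCH_HOST=https://not.my.real.search.server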
Of course - those aren't the real environment variables - the second isn't even a valid domain.
Triggering
In order to actually execute the function, it needs to be triggered somehow. Netlify has quite a few ways you can trigger functions, but in my case, I thought it would probably be best to trigger it whenever the site has successfully deployed.
So - naming the function's folder deploy-succeeded/ meant that any time my website had been successfully deployed, the function would be triggered.
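The functions directory ends up looking roughly like this (inferred from the paths in the snippets above; the handler's exact file name is an assumption):

netlify_functions/
└── deploy-succeeded/
    ├── deploy-succeeded.js   <- the handler shown earlier
    └── search-data.json      <- written at build time by gatsby-node.js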
Making Search Requests
Right - so now that everything on the MeiliSearch server is up to date, it was time to make the actual search component.
I did this using the MeiliSearch SDK for Javascript - I added it just by simply running:
npm install meilisearch
From there, I created a new component, and added two states, one for the search results, and one for the query text.
Just a small note before I continue - while this code is used in the site's search component, it isn't the code in its entirety. There's more to it, such as how the box expands and how it's styled - but the core functionality is all there 😊
const [searchResults, setSearchResults] = useState(null);
const [query, setQuery] = useState('');
After that - I initialised MeiliSearch.
const client = new MeiliSearch({
  host: process.env.GATSBY_MEILISEARCH_HOST,
  apiKey: process.env.GATSBY_MEILISEARCH_APIKEY,
})
I also added the input field which is the actual search box.
return (
  <input
    type="text"
    value={query}
    onChange={(e) => setQuery(e.target.value)}
  />
)
So far, what happens is that if I enter text into the input, the query state changes to match the value of the input.
So next would be using that value to make a search request. I did this using a React useEffect hook.
useEffect(() => {
  const index = client.getIndex('posts');

  const executeSearch = async () => {
    const search = await index.search(query);
    setSearchResults(search.hits);
  }

  executeSearch();
}, [query])
The hook gets the index, then searches it based on the query text, and updates the searchResults state with the hits array containing all the search results. The hook re-runs whenever the query value changes (i.e. whenever the input has changed).
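One thing worth noting: this fires a request on every keystroke. A small debounce helper (my own sketch - not something the component above actually uses) could cut that down, so rapid typing results in a single request for the final query:

```javascript
// Trailing-edge debounce: the wrapped function only runs once `delay`
// milliseconds have passed without another call.
function debounce(fn, delay) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delay);
  };
}

// Hypothetical usage: wrap the search call so only the final query
// of a burst of keystrokes actually triggers a request.
const calls = [];
const search = debounce((q) => calls.push(q), 20);
search('g');
search('ga');
search('gatsby');

setTimeout(() => {
  console.log(calls); // only the last query survives the debounce
}, 60);
```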
Finally - I can update the return value to display those results:
return (
  <>
    <input ... />
    {searchResults && searchResults.map((result) => (
      <Link to={`${result.slug}`} key={result.id}>
        <div>
          <h2>{result.title}</h2>
          <p>{result.description}</p>
        </div>
      </Link>
    ))}
  </>
)
And that should be it! Of course, like I said earlier - this isn't all of the search component, but I thought it would give you a bit of an insight on how it worked.
Final Thoughts
So that's how I did it - is it a bit overkill? Errr... Uh.... Maybe...
But overall, it was a rather straightforward process. I did, however, run into a few "trial and error" moments with the location of the search-data.json file, as well as with the names of the environment variables (Gatsby needs client-side environment variables to be prefixed with "GATSBY_").
I could have used gatsby-node.js to just make a request to the MeiliSearch server while going through each post, but I decided not to. Being honest... I don't really know why... 🙃
Oh well. Netlify Functions - and serverless functionality in general - are still fun.