Introduction

There are many online resources on using the Feeds module in Drupal 8 to import data from a 3rd party API. Most of those resources, however, are specific to XML and RSS feeds. In this blog post, I wanted to document my journey of getting Feeds to import data into Drupal 8 from a 3rd party API that provides JSON.

About the 3rd Party API

The 3rd party API I chose to use for the purposes of this blog post is about one of my favorite TV shows that you’ve probably heard of, Breaking Bad.

The Breaking Bad API offers a few different endpoints, but I decided to use the Characters endpoint as the source that will be imported as Drupal Character nodes.

The API simply returns an array of JSON objects, each representing a Breaking Bad character along with various attributes about that character.

Characters API response
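For reference, the response shape looks roughly like this, trimmed to a few fields. The real objects carry more attributes per character, and the img values below are placeholders rather than the API's actual image URLs:

```json
[
  {
    "char_id": 1,
    "name": "Walter White",
    "portrayed": "Bryan Cranston",
    "img": "https://example.com/walter-white.jpg"
  },
  {
    "char_id": 2,
    "name": "Jesse Pinkman",
    "portrayed": "Aaron Paul",
    "img": "https://example.com/jesse-pinkman.jpg"
  }
]
```

The keys shown here (char_id, name, portrayed, img) are the ones we'll map to Drupal fields later in this post.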

This API is free and open to use for the public and is great for testing purposes. Under normal circumstances and real projects, APIs will typically be locked down and only accessible with an authenticated user. Authentication with Feeds is outside the scope of this post.

Initial Drupal Setup

This post assumes you already have a running Composer-based Drupal installation. I won't be covering how to set that up, but for this example, I used the Drupal 8 Quickstart provided by DDEV-Local for my local environment.

Feeds module

First of all, we need to install the Feeds module. This will be the module doing most of the work and processing to import the items from the JSON API into Drupal.

Feeds Extensible Parsers module

Because the Breaking Bad API returns to us a JSON response, we need a JSON parsing system in Drupal. We can use the Feeds Extensible Parsers module to parse the JSON.

We can simply run `composer require drupal/feeds_ex`, and Composer will make sure all dependency modules and other libraries get included, such as the Feeds module and the JsonPath library.

Composer require feeds module

Note, because I’m using DDEV-Local for my local environment, I needed to run:
ddev composer require drupal/feeds_ex

Pro tip: You’ll likely also want to include this Feeds patch:
https://www.drupal.org/project/feeds/issues/2850888

The patch addresses an issue where the fetcher caches the API response behind the scenes, so re-running the import does nothing unless the API URL is slightly changed each time. Before this patch, my workaround was to append a throwaway query string parameter to the API URL each time I wanted to run the import, such as:
https://www.breakingbadapi.com/api/characters?abc
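That workaround can be sketched as appending an always-changing value, such as a timestamp, so the fetcher sees a "new" URL on every run. The cachebust parameter name here is arbitrary; the API simply ignores it:

```python
import time

# Append a throwaway, always-changing query parameter so the Feeds
# fetcher treats the URL as new and skips its cached copy.
base = "https://www.breakingbadapi.com/api/characters"
url = f"{base}?cachebust={int(time.time())}"
print(url)
```

With the patch applied, none of this is necessary; the Always download setting covered in the Feeds configuration below handles it.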

Go ahead and install the Feeds and Feeds Extensible Parsers modules on the module list page. Next, let’s create the Character content type we will use to import the Characters from the API into within Drupal.

Character Content Type

I created a new content type, Character, because, well, that makes sense for the type of content Drupal will be ingesting from the external API. We also need to create the fields for each of the pieces of character data that will be migrated into Drupal.

Character content type field configuration

To keep it simple, I’m mainly dealing with simple string fields and mappings. The Title field isn’t shown here, but it’s a required field on the node type. I’ve also added fields for Actor and Image path, which will pull from the portrayed and img keys in the API, respectively.

For more complex data sources, you may need to use the Feeds Tamper module to massage the imported data to fit your field structures, but that is outside the scope of this post.

Feeds Configuration

Next, let’s configure Feeds to be able to ingest the 3rd party JSON, parse it, and map it to the Drupal fields during the import process.

Feed Type

Simply add a new Feed Type. I called mine Breaking Bad Characters. Mainly, we’ll want to use the Download from url Fetcher and the JsonPath Parser. We’ll also need to configure where the imported content should go; in this case, choose Node for the Processor and Character for the Content type.

I’ve left all other settings at their defaults, except for one under the Processor settings: I changed the Update existing contents setting to update existing content. This controls how Drupal handles the existing content when the API data changes and the feed import runs again. During the mapping step, it will be important to make sure we have a unique identifier for each feed item, which is how Feeds knows whether an item is new or previously imported content.

Note: If you used the patch I mentioned earlier, you’ll also want to check the box for Always Download under Fetcher settings.

Click Save and add mappings. We’re not going to add our mappings quite yet, though, since we first need to review and test the API a bit to figure out how to set up the mappings.

Feed

Under the Content tab, on the toolbar, there should be a Feeds menu item we can click into. This is where we can add the actual feed and tell Drupal which API URL to use.

Click Add feed. Give it a Title, such as Breaking Bad Characters, and a Feed URL. The Feed URL will be the URL of the 3rd party API: https://www.breakingbadapi.com/api/characters

Click `Save`. The Import part of the Save and import button won’t work yet since we haven’t configured the field mappings, so let’s circle back and do that now.

Jump back over to the Feed type you created earlier and click on the Mapping tab.
There are 2 steps here - Context and Field Mappings.

Context

Looking at the `Context` setting, it’s a bit unclear what value we need to enter. According to the help text, "The base query to run", it sounds like it’s looking for the root or base of the data: essentially, telling Feeds how to access the set of data we want to import. As noted earlier, this API returns an array of objects, which is exactly what we need, but we still need to figure out how to query it with JsonPath syntax.

JsonPath Syntax

We can use a tool like https://jsonpath.herokuapp.com/ to plug in the JSON and figure out how to use JsonPath to get to the data we need.

Looking at the JsonPath documentation, you would think you could just use the dollar sign (`$`) syntax, since the docs say it provides the root object/element.

In the end, through the online tool, the documentation, and some debugging, I figured out that the Context setting I needed in this case was `$.*`.
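To see why `$.*` works where `$` alone doesn't, here's a rough, hand-rolled sketch of what each query matches. This is not how the feeds_ex parser is actually implemented, just an illustration of the two match sets:

```python
# Sample data in the same shape as the API response (trimmed).
characters = [
    {"char_id": 1, "name": "Walter White"},
    {"char_id": 2, "name": "Jesse Pinkman"},
]

def match_root(doc):
    """`$` yields a single match: the entire document."""
    return [doc]

def match_star(doc):
    """`$.*` yields one match per top-level item, which is what Feeds
    needs: one result per character to import as a node."""
    return list(doc) if isinstance(doc, list) else list(doc.values())

print(len(match_root(characters)))  # one match: the whole array
print(len(match_star(characters)))  # one match per character
```

With `$` as the Context, Feeds would see one big item instead of one item per character; `$.*` splits the array so each character object becomes its own feed item.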

Field Mappings

Now, we can configure Feeds to decide which value pulled in from the JSON maps to which field on the Drupal Character content type.

Let’s start with the easy one, the character name, and import into the Character node’s title field.

Choose Title (title) for the target. Under Source, click New source.... In the textbox that appears, enter name. That maps to the name key in the API JSON for each of the character objects in the array.

Title source mapping

We need that mapping because, as you know, the Title field is required on nodes. Without it, the feed import would fail, and Drupal would complain that the Title field is required.

Then, we add the mappings for the other fields, including the unique identifier I mentioned earlier. This uses char_id as the value for Feed GUID, which is how Feeds tracks whether content is new or existing during the import process.

Source mappings
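Conceptually, the full set of mappings amounts to a lookup from JSON source key to Drupal target. The field machine names below (field_actor, field_image_path) are assumptions for illustration; yours will be whatever you chose when creating the fields on the Character content type:

```python
# JSON source key -> Feeds mapping target, as configured in the Feeds UI.
# field_actor and field_image_path are assumed machine names.
mappings = {
    "char_id": "Feed GUID",      # unique ID, so re-imports update instead of duplicating
    "name": "Title (title)",     # required on nodes, or the import fails
    "portrayed": "field_actor",
    "img": "field_image_path",
}
```

If the API later changes a character's data, the matching char_id lets Feeds update the existing node rather than create a second one.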

Run the Import!

Now that we have Feeds configured with the API URL, along with the context and field mapping settings, we should be good to run the import. Jump back over to the Breaking Bad Characters feed we set up earlier, and under Operations, click Import, and Import again to perform the import.

Imported characters

If we visit the content overview page, we can see we now have Character nodes in the list. Editing one of the character nodes, we can see that the data imported into Drupal matches what the API provides and how we mapped it over using Feeds.

Characters import from JSON into Drupal

It's a Wrap!

This is admittedly a simple example of using Feeds to import JSON into Drupal, but I mainly wanted to highlight the problem areas I ran into along the way, in an effort to help others who face the same issues.