Selenium and PHP – Write a snapshot comparing tool

The Agenda

This tutorial explains how to write a script that takes a list of URLs and creates a snapshot image of each one. The next time you run the script, it compares the new images against the saved ones and alerts you if there are any differences.

This is very helpful during active development, because it can identify unexpected changes in the pages and potential errors.

Run Selenium standalone server

Download the Selenium Standalone Server from the official Selenium downloads page.

You can put the jar file wherever you want, but I personally prefer to keep things organized and put it in my home directory:

cp selenium-server-standalone-3.141.59.jar ~/.

We can also create a shell script to start the server.

./start-selenium.sh

java -jar selenium-server-standalone-3.141.59.jar

Start the server with ./start-selenium.sh and check that it is running by navigating the browser to: http://localhost:4444/wd/hub/static/resource/hub.html

Install PHP composer.

The next step is to install Composer in our home folder, since we will need it to install the WebDriver package. Create a new shell script with the following contents:

php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
php -r "if (hash_file('sha384', 'composer-setup.php') === '48e3236262b34d30969dca3c37281b3b4bbe3221bda826ac6a9a62d6444cdb0dcd0615698a5cbe587c3f0fe57a54d8f5') { echo 'Installer verified'; } else { echo 'Installer corrupt'; unlink('composer-setup.php'); } echo PHP_EOL;"
php composer-setup.php
php -r "unlink('composer-setup.php');"

Then add execute rights with chmod +x script.sh and run the script. This will install Composer.

Install WebDriver

composer require facebook/webdriver

Install chrome driver

brew cask install chromedriver

And now let’s write some PHP code.

<?php

 
use Facebook\WebDriver\Remote\DesiredCapabilities;
use Facebook\WebDriver\Remote\RemoteWebDriver;
use Facebook\WebDriver\WebDriverBy;
use Facebook\WebDriver\WebDriverDimension;
use Facebook\WebDriver\WebDriverExpectedCondition;
use Facebook\WebDriver\WebDriverPoint;
 
$composer_dir = '/Users/toninichev/composer';
require_once $composer_dir . '/vendor/autoload.php';
 
$host = 'http://127.0.0.1:4444/wd/hub'; // this is the default port
 
$driver = RemoteWebDriver::create($host, DesiredCapabilities::chrome());
 
// Set size
$driver->manage()->window()->setPosition(new WebDriverPoint(0,0));
$driver->manage()->window()->setSize(new WebDriverDimension(1280,800));

 
function takeScreenshot($driver, $url, $id) {
    // Navigate to the page
    $driver->get($url);
    
    // Give the page a few seconds to finish rendering before the screenshot.
    // Note: $driver->wait() on its own only creates a WebDriverWait instance,
    // so a plain sleep() is used here instead.
    sleep(3);
    
    // Take a screenshot
    $driver->takeScreenshot(__DIR__ . "/screenshots/scr" . $id . "-tmp.png");
}

$urls = array('https://www.toni-develops.com/', 'https://www.toni-develops.com/2017/04/27/git-bash-cheatsheet/', 'https://www.toni-develops.com/webpack/', 'https://www.toni-develops.com/algorithms/');
//$urls = array('https://www.toni-develops.com/', 'https://www.toni-develops.com/2017/04/27/git-bash-cheatsheet/');

$html = '';

for($i = 0; $i < count($urls); $i++) {
    takeScreenshot($driver,$urls[$i], $i);   

    $a = sha1_file(__DIR__ . "/screenshots/scr" . $i . "-tmp.png");
    $b = file_exists(__DIR__ . "/screenshots/scr" . $i . ".png") ? sha1_file(__DIR__ . "/screenshots/scr" . $i . ".png") : null;
    
    if($a == $b || $b == null) {
        $className = "match";
    }else {
        $className = "noMatch";
    }

    rename(__DIR__ . "/screenshots/scr" . $i . "-tmp.png", __DIR__ . "/screenshots/scr" . $i . ".png");
    $html .= '<div class="picWrapper ' . $className . '"><div><input type="text" value="' . $urls[$i] . '" readonly /></div><img src="screenshots/scr' . $i . '.png"/></div>';
}


// Close the Chrome browser
$driver->quit();
?>

<html>
<head>
<style>
    .picWrapper input {
        width: 100%;
        text-align:center;
    }

    .picWrapper {
        text-align: center;
    }

    .match {
        background: green;
    }

    .noMatch {
        background: red;
    }


</style>
</head>
<body>

    <?php
        echo $html;
    ?>
      
</body>
</html>

 

Maximum Subarray

Task

Given an integer array nums, find the contiguous subarray (containing at least one number) which has the largest sum and return its sum.

Example:

Input:

 [-2,1,-3,4,-1,2,1,-5,4],

Output:

 6

Explanation:

 [4,-1,2,1] has the largest sum = 6.

This problem was taken from Leetcode

Solution

Brute force.

The straightforward solution is to iterate over every starting index in the array, compute the sum of every subarray beginning there, and compare the sums.

For the example above, [-2,1,-3,4,-1,2,1,-5,4], we would compute:

starting index: 1
-2 = -2
-2, 1 = -1
-2, 1, -3 = -4
-2, 1, -3, 4 = 0
-2, 1, -3, 4, -1 = -1
-2, 1, -3, 4, -1, 2 = 1
-2, 1, -3, 4, -1, 2, 1 = 2
-2, 1, -3, 4, -1, 2, 1, -5 = -3
-2, 1, -3, 4, -1, 2, 1, -5, 4 = 1

starting index: 2
1 = 1
1, -3 = -2
1, -3, 4 = 2
1, -3, 4, -1 = 1
1, -3, 4, -1, 2 = 3
1, -3, 4, -1, 2, 1 = 4
1, -3, 4, -1, 2, 1, -5 = -1
1, -3, 4, -1, 2, 1, -5, 4 = 3

starting index: 3
-3 = -3
-3, 4 = 1
-3, 4, -1 = 0
-3, 4, -1, 2 = 2
-3, 4, -1, 2, 1 = 3
-3, 4, -1, 2, 1, -5 = -2
-3, 4, -1, 2, 1, -5, 4 = 2

starting index: 4
4 = 4
4, -1 = 3
4, -1, 2 = 5
4, -1, 2, 1 = 6
4, -1, 2, 1, -5 = 1
4, -1, 2, 1, -5, 4 = 5

… and so on till the last element in the array.

So the winner is clearly 4, -1, 2, 1 = 6, but this approach repeats a lot of work (it is quadratic in the length of the array). Interestingly, there is a linear solution: Kadane's algorithm.
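For reference, a brute-force version of the above (quadratic time, shown here only for comparison with the linear solution below) might look like this:

var maxSubArrayBruteForce = function(nums) {
    var best = nums[0];
    // try every starting index ...
    for (var start = 0; start < nums.length; start++) {
        var sum = 0;
        // ... and extend the subarray one element at a time
        for (var end = start; end < nums.length; end++) {
            sum += nums[end];            // sum of nums[start..end]
            best = Math.max(best, sum);  // keep the largest sum seen so far
        }
    }
    return best;
};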

Using Kadane's algorithm.

The basic idea of this algorithm is to implicitly break the array into mutually exclusive segments (starting a new segment whenever the running sum stops being worth keeping), calculate their running sums, and take the largest one.

First, let's look closely at what we are doing to find the maximum sum using brute force. We are splitting the array into all possible contiguous subarrays and calculating their sums. This means that:
– if the array contains only negative values, we don't really need to split the array at all, because the answer is simply the largest value in it, i.e. [-1,-5,-3] = -1 (the one closest to 0)
– if the array mixes negative and positive values, the maximum contiguous-subarray sum will be > 0, so we can ignore any running sum that has become negative.

This way we can iterate through each element of the array, nums[i] (where i is the index of the element, starting with the second one, since the first is our starting value), and keep a running sum: max_here = max_here + nums[i].
If that sum drops below the current element itself, we already know for sure that the elements before it are not worth keeping, so we restart the sum from the current element: max_here = nums[i].

So, for the example above, [-2,1,-3,4,-1,2,1,-5,4]:
We start by setting both variables to the first element of the array: max_here = max_so_far = nums[0]. We use max_so_far to store the maximum sum discovered so far, and max_here to hold the maximum sum of a subarray ending at the current position.

Starting with max_here = max_so_far = nums[0] = -2, the iterations look like this (max_here and max_so_far show the values after each step):

i = 1, nums[1] = 1:
sum = max_here + nums[1] = -2 + 1 = -1, which is smaller than nums[1], so max_here = nums[1] = 1 (this is the Math.max(max_here + nums[i], nums[i]) line in the snippet below); and since max_here > max_so_far, max_so_far = max_here = 1.
→ max_here = 1, max_so_far = 1

i = 2, nums[2] = -3:
sum = 1 + (-3) = -2, which is bigger than nums[2] = -3, so max_here = sum = -2; but max_here is smaller than max_so_far, so max_so_far stays equal to 1.
→ max_here = -2, max_so_far = 1

i = 3, nums[3] = 4:
sum = -2 + 4 = 2, which is smaller than 4, so max_here = nums[3] = 4, which is bigger than max_so_far, so max_so_far = max_here = 4.
→ max_here = 4, max_so_far = 4

i = 4, nums[4] = -1:
sum = 4 - 1 = 3, which is bigger than -1, so max_here = sum = 3; max_so_far stays 4.
→ max_here = 3, max_so_far = 4

i = 5, nums[5] = 2:
sum = 3 + 2 = 5, which is bigger than nums[5] = 2, so max_here = 5, and max_so_far = 5.
→ max_here = 5, max_so_far = 5

i = 6, nums[6] = 1:
sum = 5 + 1 = 6, which is bigger than nums[6] = 1, so max_here = 6, and max_so_far = 6.
→ max_here = 6, max_so_far = 6

i = 7, nums[7] = -5:
sum = 6 + (-5) = 1, which is bigger than nums[7] = -5, so max_here = 1; max_so_far stays 6.
→ max_here = 1, max_so_far = 6

i = 8, nums[8] = 4:
sum = 1 + 4 = 5, which is bigger than nums[8] = 4, so max_here = 5; but max_here < max_so_far, so max_so_far stays 6, which is the maximum sum.
→ max_here = 5, max_so_far = 6

 

/**
 * @param {number[]} nums
 * @return {number}
 */
var maxSubArray = function(nums) {
    var max_here = max_so_far = nums[0];
    

    for(var i=1;i < nums.length; i ++) {
        max_here = Math.max(max_here + nums[i], nums[i]);
        max_so_far = Math.max(max_so_far, max_here);
    }    

    return max_so_far;
};

 

 


Adding Jest and Enzyme and creating snapshot tests


In this tutorial we will learn how to install Jest and Enzyme, and create snapshot tests.

If you are not familiar with these tools, please read about them first, since we will focus mainly on installing and using them.

The basic idea is that we are going to create test cases that render a component and save a snapshot of it. The next time the tests run, the rendered output is compared against the saved snapshot, making sure that the component either has not changed, or still renders as expected.

Setting up Jest and Enzyme.

We will need some extra packages in order to make Jest work with Enzyme.

Installing the packages.

  • enzyme – the actual enzyme testing utility.
  • enzyme-adapter-react-16 – the react adapter for enzyme. There are different adapters for different React versions, and since we are using react 16 we will install this version. A full list of adapter packages could be found here
  • enzyme-to-json – a serializer for the components.
yarn add jest enzyme enzyme-adapter-react-16 enzyme-to-json --dev

In addition, we also have to install:

yarn add babel-jest --dev

  • babel-jest – allows Jest to use babel to transpile JS.

Before using Enzyme we have to configure it. Create a file with the following contents:

src/setupTests.js

import { configure } from 'enzyme';
import Adapter from 'enzyme-adapter-react-16';
configure({ adapter: new Adapter() });

This tells Enzyme to run with React using the adapter we just installed. Make sure this file is included before using Enzyme. Open package.json and add the following config:

./package.json

...
  },
  "jest": {
    "setupFiles": [
      "./src/setupTests.js"
    ]
  },
...

Now we are almost ready to write our first test. The Header component is a perfect candidate since it is not connected to the Redux store.

Create a new file in the ./src/components/Header folder and let's

Add a Snapshot Test

./src/components/Header/index.test.js

import React from 'react';
import { shallow } from 'enzyme';
import Header from './index';
import toJson from 'enzyme-to-json';

describe('Testing Header component', () => {
    it('renders as expected!!!', () => {
      const wrapper = shallow(
        <Header title="Title test" />
      );
     expect(toJson(wrapper)).toMatchSnapshot();
   });
});

Unfortunately, if we run yarn test it will fail with an error, since Jest doesn't know how to deal with anything other than JavaScript.
But there is an easy fix: we can map CSS imports to a JS file that Jest understands. Create an empty file at

./src/__mocks__/styleMock.js
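The mock can stay completely empty; if you prefer an explicit module, a one-line export of an empty object works just as well:

// styles are irrelevant for the tests, so the mock simply exports an empty object
module.exports = {};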


Then set up Jest to use the mapping:

./package.json

...
  "jest": {
    "setupFiles": [
      "./src/setupTests.js"
    ],
    "moduleNameMapper": {
      "\\.(css|less|scss)$": "<rootDir>/src/__mocks__/styleMock.js"
    }    
  },
...

Now let’s run yarn test and we should see something like this:

yarn test
yarn run v1.12.3
$ jest
 PASS  src/components/Header/index.test.js
  Testing Header component
    ✓ renders as expected!!! (15ms)

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   1 passed, 1 total
Time:        2.194s
Ran all test suites.
✨  Done in 3.05s.

Perfect ! Our first snapshot test is done.

Testing renderer component

src/components/Home/renderer.test.js

import React from 'react';
import { shallow, mount } from 'enzyme';
import toJson from 'enzyme-to-json';
import Component from './renderer';

describe('Testing Home component', () => {
    it('renders as expected!!!', () => {
      const wrapper = shallow(<Component styles={{}} title="MyTest" />);

     expect(toJson(wrapper)).toMatchSnapshot();
     expect(wrapper.contains(<div>MyTest</div>)).toBe(true);
   });
});

We also added expect(wrapper.contains(<div>MyTest</div>)).toBe(true); which checks for specific content inside the component.
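Enzyme offers a few other handy ways to inspect the rendered output as well, for example (the selectors below are just for illustration):

// the rendered text should include the title we passed in
expect(wrapper.text()).toContain('MyTest');
// and at least one <div> should have been rendered
expect(wrapper.find('div').exists()).toBe(true);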

Testing Redux connected component.

./src/components/About/index.test.js

import React from 'react';
import { mount } from 'enzyme';
import toJson from 'enzyme-to-json';
import { Provider } from 'react-redux';
import reducers from '../../reducers';
import { createStore} from 'redux';
import About from './index.js';

const store = createStore(reducers, {});
let wrapper;
describe('Testing About component', () => {

  beforeEach(() => {
    // Runs a function before each of the tests in this file runs
    wrapper = mount(
      <Provider store={store}>
        <About userName="Toni" />
      </Provider>
    );
  });

  it('renders as expected', () => {
   expect(toJson(wrapper)).toMatchSnapshot();
  });

  it('expect to have state.userName set to John', () => {   
   wrapper.setState({userName: "John"});
   expect(wrapper.state('userName')).toEqual("John");
  });  
});

Since we wrapped the About component in the Provider component, we have to mount the component, which renders it in full, instead of doing shallow rendering (like in the previous example); otherwise shallow would render only the Provider component.

Now the two test suites can be executed together with yarn test:

 PASS  src/components/About/index.test.js
  Testing About component
    ✓ renders as expected (56ms)
    ✓ expect to have state.userName set to John (14ms)

Test Suites: 1 passed, 1 total
Tests:       2 passed, 2 total
Snapshots:   1 passed, 1 total
Time:        3.102s
Ran all test suites matching /.\/node_modules\/jest\/bin\/jest.js|src\/components\/About\/index.test.js/i.

Adding more tests.

./src/components/Greetings/renderer.test.js

import React from 'react';
import { mount } from 'enzyme';
import toJson from 'enzyme-to-json';
import reducers from '../../reducers';
import { createStore} from 'redux';
import Component from './renderer';

const store = createStore(reducers, {});
let wrapper;
describe('Testing Greetings component', () => {

  beforeEach(() => {
    wrapper = mount(
        <Component styles={{}} store={store} />
    );
  });

  it('renders as expected', () => {
    // to snapshot test
    expect(toJson(wrapper)).toMatchSnapshot();
  });

  it('textbox does not exist', () => {  
    expect(wrapper.find('input').exists()).toBe(false);
  });

  it('textbox exists after h2 click', () => {  
    // simulate a click on the h2 tag, triggering the change that shows the input text box
    wrapper.find('h2').simulate('click');
    expect(wrapper.find('input').exists()).toBe(true);
  });    

  it('textbox values change properly', () => {    
    // text box value tests
    expect(wrapper.find('input').props().value).toBe('No Name');
    wrapper.find('input').props().value = 'test';
    expect(wrapper.find('input').props().value).toBe('test');
  });    

});
  • beforeEach – runs a function before each of the tests in this file. If the function returns a promise or is a generator, Jest waits for that promise to resolve before running the test.
  • notice that this time the store is passed directly to the component as a prop – another way of wiring it up, instead of wrapping the component in a Provider.

Setting up production server.


The development config adds a lot of heavy processing in order to have source maps working and debugging easy, but we don't need this in production. All we need is speed. Let's create another config that we will use for the production build.

There are different ways to achieve this. One way, for example, is to create a common config plus one for production and one for development, and then use the webpack-merge plugin to combine them (see the sketch below). Or we could simply create another, completely separate Webpack config for the production server, but then we would have to maintain multiple full configs, which is far from perfect. So let's use the same concept as before and extend (and override) the base config with a production one.
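For reference, the webpack-merge approach mentioned above would look roughly like this (a sketch only; it assumes a shared webpack.common.js and we won't use it in this tutorial):

// webpack.prod.config.js (webpack-merge variant, not used here)
const merge = require('webpack-merge');
const common = require('./webpack.common.js');

module.exports = merge(common, {
  mode: 'production',
});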

Creating production config.

Add

./webpack.prod.config.js

const webpack = require('webpack');
let config = require('./webpack.base.config.js');

config.mode = "production";
config.devtool = "";

config.entry = [
    './src/index.js', 
];

module.exports = config;

Let's also modify the base config to use plain JavaScript (require) syntax, since Webpack itself won't know how to run ES module imports; at this point there is no Babel transpiler in front of it to convert the code to plain JavaScript.

./webpack.base.config.js

const getEnvironmentConstants = require('./getEnvironmentConstants');
const webpack =require('webpack');

module.exports = {
  mode: 'development',
  devtool: 'eval-source-map',
  entry: [
    '@babel/polyfill',    
    './src/index.js',
  ],
  output: {
    filename: '[name]-bundle.js',
    publicPath: '/dist/',
  },  
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: "babel-loader"
        }
      },

      // SCSS
      {
        test: /\.scss$/,
        use: [
          'style-loader',
          {
            loader: 'css-loader',
            options: {
              modules: true,
              importLoaders: 2,
              localIdentName: '[folder]-[local]',
              sourceMap: true
            }
          },
          {
            loader: 'postcss-loader',
            options: {
              plugins: () => [require('autoprefixer')()],
              sourceMap: true
            },
          },
          {
            loader: 'sass-loader',
            options: {
              outputStyle: 'expanded',
              sourceMap: true
            }
          }
        ],
      },
      // images
      {
        test: /\.(png|jp(e*)g|svg)$/,  
        use: [{
            loader: 'url-loader',
            options: { 
                limit: 8000, // Convert images < 8kb to base64 strings
                name: 'images/[hash]-[name].[ext]'
            } 
        }]
      },
      //File loader used to load fonts
      {
        test: /\.(woff|woff2|eot|ttf|otf)$/,
        use: ['file-loader']
      }                    
    ]
  },
  plugins: [
    new webpack.DefinePlugin({ 'process.env' : getEnvironmentConstants() } )
  ]
};

Creating production build.

Edit package.json and add a clean script that removes the dist folder, and a build-prod script.

./package.json

...
"scripts": {
  "start-cli": "webpack-dev-server --hot --history-api-fallback --config webpack.cli.config.js",
  "start-api": "babel-node server-api.js",
  "start-middleware": "babel-node server-middleware.js",
  "clean": "rm -rf ./dist",
  "lint": "eslint .",
  "build-dev": "webpack --mode development",
  "build-prod": "webpack --config webpack.prod.config.js",
},
...

what we just did:
– we added a cleanup script, clean, that removes the production build directory
– we added a production build script, build-prod, that uses the new production config

Now we can clear the ./dist folder by running yarn clean, then do the production build with yarn build-prod, and have the production bundle ready.

Creating Express production server.

There are different ways to serve your files once the production build is done. One way is to use an nginx server (which I might cover in a separate tutorial later), or we could use the Express server.

Let's create a simple Express production server. Create prod.server.js with the following contents:

./prod.server.js

const express = require('express');
const path = require('path')
const app = express();
const port = 8080;

app.use('/dist', express.static('dist'));
app.get('*', (req, res)=>{
  res.sendFile(path.join(__dirname, 'index.html'));
})
app.listen(port, () => console.log(`Example app listening on port ${port}!`))

what we just did:
– we created an Express server
– we told the server to serve every static file (JS, CSS, images) from the /dist folder (the app.use('/dist', ...) line)
– everything else that doesn't match the static content falls into '*' and will be served index.html

Looks familiar? It's pretty similar to what we did in the previous chapters.

Let’s add a production server start script:

./package.json

"scripts": {
  "start-cli": "webpack-dev-server --hot --history-api-fallback --config webpack.cli.config.js",
  "start-api": "babel-node server-api.js",
  "start-middleware": "babel-node server-middleware.js",
  "clean": "rm -rf ./dist",
  "lint": "eslint .",
  "build-dev": "webpack --mode development",
  "build-prod": "webpack --config webpack.prod.config.js",
  "run-prod-server": "node ./prod.server.js",
  "build-and-run-prod-server": "yarn clean && yarn build-prod && yarn run-prod-server"
},

what we just did:
– we created run-prod-server, which simply starts the Express server that statically serves the contents of the /dist folder where the build-prod script dumps the bundle
– to make things easier we also created a build-and-run-prod-server entry that does all the necessary steps to run the production server:
– it calls the yarn clean script to remove the ./dist folder before rebuilding
– it calls yarn build-prod to make a fresh production build
– it tells node to start with ./prod.server.js, which we just created as an entry point, starting the Express server.

Give it a try with yarn build-and-run-prod-server and observe the ./dist folder. It will disappear for a brief moment, then re-appear with the fresh content, and once you see the message 'Example app listening on port 8080!' in the console you are ready to hit the browser URL.

If everything went well, you should see the site running, using the production Express server.

Extracting CSS into separate files.

Right now, if we look at the ./dist folder we will see only *.js files and the font file that we added, which means that all CSS is embedded in the JavaScript bundles. That is OK for development, but far from ideal for production: the browser can't cache the CSS separately, and we will see styling glitches since it takes time for the CSS to be applied. To solve this problem we have to extract the CSS from the JS and load it separately.

Adding mini-css-extract-plugin.

And there is a plug-in for this called mini-css-extract-plugin. Let’s install it.

yarn add mini-css-extract-plugin

and let’s tell Webpack to use it only with the production build:

./webpack.prod.config.js

const webpack = require('webpack');
let config = require('./webpack.base.config.js');
const MiniCssExtractPlugin = require("mini-css-extract-plugin");

config.mode = "production";
config.devtool = "";

config.entry = [
    './src/index.js', 
];

config.module.rules[1].use[0] = MiniCssExtractPlugin.loader;

config.plugins = [ ... [new MiniCssExtractPlugin({
        // these are optional
        filename: "[name].css",
        chunkFilename: "[id].css"
    })], 
    ... config.plugins
];

module.exports = config;

what we just did:
– we replaced 'style-loader' with the mini-css-extract-plugin loader to extract all CSS from the JS bundle (the config.module.rules[1].use[0] line)
– by default, even without any parameters, mini-css-extract-plugin creates two kinds of bundles: one main.css and the rest on a per-component basis. This means each component gets its own CSS bundle, dynamically loaded when the component is rendered. For a large project this dramatically reduces the initial CSS load.

You can play with the optional parameters filename and chunkFilename and see how they affect the generated CSS file names. If, for example, you would like to rename the main file to global-main.css, you could set filename: "global-[name].css"; chunkFilename controls the names of the per-chunk files. The placeholders in brackets are replaced automatically by mini-css-extract-plugin; [id], for example, is replaced with the chunk id.
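As a quick illustration, the same plugin instance as above with renamed output files would look like this:

new MiniCssExtractPlugin({
  filename: "global-[name].css", // the main bundle becomes global-main.css
  chunkFilename: "[id].css"      // per-component chunks are named after their chunk id
})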

Rebuild the project:

yarn build-prod

Now let's look at the ./dist folder. Next to every JS bundle we should see the corresponding CSS for that module, which looks much better. Run the production server with yarn build-and-run-prod-server, load the project, and observe the network tab while navigating to different site sections. You will see the CSS file for each component load as soon as you navigate to a new section. But we are missing main.css, which is not loaded on demand. Let's add a tag to load it:

<!DOCTYPE html>
<html>
    <head>
        <meta charset="UTF-8">
        <title>Babel Webpack Boilerplate</title>
        <link rel="stylesheet" type="text/css" href="dist/main.css">
    </head>
    <body>
        <div id="root"></div>
        <script type="text/javascript" src="dist/main-bundle.js"></script>
    </body>
</html>

Run the production server again and the app should look like before.

But there is another problem: open one of the generated CSS files and you will notice that the CSS is not minified.

Minimizing the CSS.

Let’s use optimize-css-assets-webpack-plugin

yarn add optimize-css-assets-webpack-plugin --dev

./webpack.prod.config.js

const webpack = require('webpack');
let config = require('./webpack.base.config.js');
const MiniCssExtractPlugin = require("mini-css-extract-plugin");
const OptimizeCSSAssetsPlugin = require("optimize-css-assets-webpack-plugin");

config.mode = "production";
config.devtool = "";

config.entry = [
    './src/index.js', 
];

config.module.rules[1].use[0] = MiniCssExtractPlugin.loader;

config.plugins = [ ... [new MiniCssExtractPlugin({
        // these are optional
        filename: "[name].css",
        chunkFilename: "[id].css"
    })],
    new OptimizeCSSAssetsPlugin({}),   
    ... config.plugins
];

module.exports = config;

Rebuild and run the production server, and if you open some of the CSS files you will see that they are now minified.

Just what we need for production build!


Adding environment variables file.

Although Webpack already comes with two modes, development and production, it comes in handy to have different environment variables for development and production, stored in separate files.

Let's create a .env file, which will be our development environment variables file.

Creating the .env file

./.env

APP_NAME=Webpack React Tutorial
GRAPHQL_URL=http://localhost:4001/graphql

The .env file is not a standard JavaScript module, so we can't use its variables out of the box. We will need two modules: dotenv and dotenv-expand.

Adding and configuring Dotenv to load variables into process.env.

Dotenv is a module that loads environment variables from a .env file into process.env.

yarn add dotenv dotenv-expand

Let's give it a try and print out GRAPHQL_URL on the backend, using the server-api.js config. All we need to do is 'tell' the dotenv module to load the variables into process.env:

./src/server-api.js

import WebpackDevServer from 'webpack-dev-server';
import webpack from 'webpack';
import config from './webpack.api.config.js';
require('dotenv').config();

console.log(">>>" + process.env.GRAPHQL_URL);

const compiler = webpack(config);
const server = new WebpackDevServer(compiler, {
  hot: true,
  publicPath: config.output.publicPath,
  historyApiFallback: true
});
server.listen(8080, 'localhost', function() {});

Run the project and you will see the GraphQL URL printed in the backend console.

Now let's create a getEnvironmentConstants helper that uses dotenv to load the variables, plus a filter so that only the variables listed in frontendConstants are passed to the front end. This way, important backend-only variables such as database passwords won't be exposed in the client source code.

./getEnvironmentConstants.js

const fs = require('fs');

// Load environment variables from these files
const dotenvFiles = [
  '.env'
];


// expose environment variables to the frontend
const frontendConstants = [
  'APP_NAME',
  'GRAPHQL_URL'
];

function getEnvironmentConstants() {
  
  dotenvFiles.forEach(dotenvFile => {
    if (fs.existsSync(dotenvFile)) {
      require('dotenv-expand')(
        require('dotenv').config({
          path: dotenvFile,
        })
      );
    }
  });
  
  const arrayToObject = (array) =>
  array.reduce((obj, item, key) => {
    obj[item] = JSON.stringify(process.env[item]);
    return obj
  }, {})

  return arrayToObject(frontendConstants);      
}

module.exports = getEnvironmentConstants;

Once we have this object in place we can use the DefinePlugin to pass the values to the frontend.

Adding the DefinePlugin.

If we go back in this tutorial, we will remember that we showed how to configure Webpack in three different ways: using the CLI, the webpack API, and the server middleware.
The best place to add DefinePlugin is webpack.base.config.js, so all three Webpack set-ups take advantage of it. Let's import getEnvironmentConstants and pass its result as a parameter to DefinePlugin.

./webpack.base.config.js

import getEnvironmentConstants from './getEnvironmentConstants';
import webpack from 'webpack';

module.exports = {
  mode: 'development',
  devtool: 'eval-source-map',
  entry: [
    '@babel/polyfill',    
    './src/index.js',
  ],
  output: {
    filename: '[name]-bundle.js',
    publicPath: '/dist',
  },  
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: "babel-loader"
        }
      },

      // SCSS
      {
        test: /\.scss$/,
        use: [
          'style-loader',
          {
            loader: 'css-loader',
            options: {
              modules: true,
              importLoaders: 2,
              localIdentName: '[folder]-[local]',
              sourceMap: true
            }
          },
          {
            loader: 'postcss-loader',
            options: {
              plugins: () => [require('autoprefixer')()],
              sourceMap: true
            },
          },
          {
            loader: 'sass-loader',
            options: {
              outputStyle: 'expanded',
              sourceMap: true
            }
          }
        ],
      },
      // images
      {
        test: /\.(png|jp(e*)g|svg)$/,  
        use: [{
            loader: 'url-loader',
            options: { 
                limit: 8000, // Convert images < 8kb to base64 strings
                name: 'images/[hash]-[name].[ext]'
            } 
        }]
      },
      //File loader used to load fonts
      {
        test: /\.(woff|woff2|eot|ttf|otf)$/,
        use: ['file-loader']
      }                    
    ]
  },
  plugins: [
    new webpack.DefinePlugin({ 'process.env' : getEnvironmentConstants() } )
  ]
};

what we just did:
– at the top we imported getEnvironmentConstants and webpack, since we need them to instantiate the plugin
– at the bottom we added the DefinePlugin entry to the plugins array

We have to make one more change in order to have the plugin working for all Webpack configs:

./webpack.api.config.js

const webpack = require('webpack');
let config = require('./webpack.base.config.js');

config.entry = [
  '@babel/polyfill',    
  './src/index.js',
  'webpack/hot/dev-server',
  'webpack-dev-server/client?http://localhost:8080/',       
];

config.plugins = [... [new webpack.HotModuleReplacementPlugin()], ... config.plugins ];

module.exports = config;

and

./webpack.middleware.config

const webpack = require('webpack');
let config = require('./webpack.base.config.js');

config.entry = [
  '@babel/polyfill',    
  './src/index.js',
  'webpack-hot-middleware/client?path=/__webpack_hmr&timeout=20000',        
];

config.plugins = [... [new webpack.HotModuleReplacementPlugin()], ... config.plugins ];

module.exports = config;

what we just did:
Since we added a plugins array in webpack.base.config, we don't want to override it here and lose those changes. That's why we merge the arrays using the spread operator.
If you are unfamiliar with the spread operator, basically what it does is 'spread' one or more arrays into the current array:

var a = [1,2,3];

var b = [...a, ...[4,5,6]];

console.log(b);

result: (6) [1, 2, 3, 4, 5, 6]

Accessing env variables in the front-end.

Now let's use these variables. We can replace the hardcoded GraphQL URL with the one from the .env file.

Using the variables on the back end is straightforward: we just include the .env file and read the variables. Passing them to the front end requires a little more effort: we have to use the DefinePlugin, which allows us to create global constants that are configured at compile time.
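To make that concrete: DefinePlugin performs a textual substitution at build time, so, roughly speaking (this is an illustration rather than the exact webpack output):

// what we write in the source:
const GRAPHQL_URL = process.env.GRAPHQL_URL;

// roughly what ends up in the bundle, given GRAPHQL_URL=http://localhost:4001/graphql in .env:
const GRAPHQL_URL = "http://localhost:4001/graphql";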

./src/components/App/index.js

import React, { Component } from 'react';
import PageLayout from '../../containers/PageLayout';
import { ApolloProvider } from 'react-apollo';
import { ApolloClient } from 'apollo-client';
import { HttpLink } from 'apollo-link-http';
import { InMemoryCache } from 'apollo-cache-inmemory';
import { BrowserRouter as Router, Route, Switch } from 'react-router-dom';
import { Provider } from 'react-redux';
import { createStore} from 'redux';
import reducers from '../../reducers';

import styles from './styles.scss';

const store = createStore(reducers, {});
export default class App extends Component {
  render() {
    const GRAPHQL_URL = process.env.GRAPHQL_URL;
    const client = new ApolloClient({
      link: new HttpLink({ uri:  GRAPHQL_URL }),
      cache: new InMemoryCache()
    });  
    return (
      <div className={styles.appWrapper}>
        <Provider store={store}>
          <ApolloProvider client={client}>
            <Router>
              <Switch>
              <Route exact path="*" component={PageLayout} />  
              </Switch>
            </Router>
          </ApolloProvider>
        </Provider>
      </div>        
    );
  }
}

 

We can also print the app name in the header section using process.env.APP_NAME:

./src/components/Header/index.js

import React from 'react';
import { Link } from 'react-router-dom';
const styles = require('./styles.scss');
const Header = ( {title} ) => (
  <div>
    <div className={styles.wrapper}>      
      <h2>{ title } { process.env.APP_NAME } </h2>
      <ul>
        <li><Link to='/home'>HOME</Link></li>
        <li><Link to='/greetings'>GREETINGS</Link></li>       
        <li><Link to='/dogs-catalog'>DOGS CATALOG</Link></li>
        <li><Link to='/about'>ABOUT</Link></li>
      </ul>
    </div>
  </div>
);
export default Header;

Now, start the server using yarn start-api and if everything works fine you will see the “Webpack React Tutorial” in the header.

 


Breaking the schema into separate groups of files.

Having the entire schema in the server.js file is not practical, especially as the schema grows, so let's move it into separate folders for easier management:

mkdir ./src/schema/queries -p
mkdir ./src/schema/types -p
mkdir ./src/schema/mutations -p

./src/schema/index.js

const graphql = require('graphql');
const queries = require('./queries');
const mutations = require('./mutations');

var rootQuery = new graphql.GraphQLObjectType({
  name: 'Query',
  fields: {...queries, ...mutations },
});

module.exports = new graphql.GraphQLSchema({
  query: rootQuery
});

./src/schema/types/index.js

let Dogs = require('./dogs.js');

module.exports = {
  ...Dogs,
};

./src/schema/types/dogs.js

const graphql = require('graphql');

module.exports = new graphql.GraphQLObjectType({
  name: 'dogType',
  fields: {
    id: { type: graphql.GraphQLString },
    breed: { type: graphql.GraphQLString },
    displayImage: { type: graphql.GraphQLString },
  }
});

./src/schema/queries/index.js

let Dogs = require('./dogs.js');

module.exports = {
  ...Dogs
};

./src/schema/queries/dogs.js

const graphql = require('graphql');
const dogType = require('../types/dogs');
const dogs = require('../../models/mock_data/dogs.js');

module.exports = {
  getDogByBreed: {
    type: dogType,
    args: {
      breed: { type: graphql.GraphQLString }
    },
    resolve: function (_, {breed}) {
      var result = dogs.find(function(dog){
        return breed == dog.breed;
      });
      return result;
    }
  } 
}

./src/schema/mutations/index.js

let Dogs = require('./dogs.js');

module.exports = {
  ...Dogs
};

./src/schema/mutations/dogs.js

const graphql = require('graphql');
const dogs = require('../../models/mock_data/dogs.js');

module.exports = {
  addDogBreed: {
    type: graphql.GraphQLString,
    args: {
      id: { type: graphql.GraphQLString },
      breed: { type: graphql.GraphQLString },
      displayImage: { type: graphql.GraphQLString }
    },
    resolve: function (_, {id, breed, displayImage}) {
      dogs.push({
        id: id,
        breed: breed,
        displayImage: displayImage
      });
      return "OK!";
    }
  }   
}

./server.js

var express = require('express');
var graphqlHTTP = require('express-graphql');
const schema = require('./src/schema');


// Logger middleware
var logger = function(req, res, next) {
  console.log("GOT REQUEST >", req.ip);
  next(); // Passing the request to the next handler in the stack.
}

var app = express();
app.use(logger);
app.use('/graphql', graphqlHTTP({
  schema: schema,
  graphiql: true,
}));
app.listen(4000);
console.log('Running a GraphQL API server at localhost:4000/graphql');

 

Setting up faster development environment using Nodemon and Babel.


Adding Nodemon

Nodemon works just like node, with the benefit that it monitors the project folder and restarts automatically on changes, which makes development more convenient. Let's go ahead and install it:

yarn add nodemon --dev

And replace node with nodemon in the start script:

./package.json

{
  "name": "graphql-tutorial",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT",
  "scripts": {
    "start": "nodemon server.js"
  },
  "dependencies": {
    "express": "^4.16.4",
    "express-graphql": "^0.7.1",
    "graphql": "^14.0.2"
  },
  "devDependencies": {
    "nodemon": "^1.18.9"
  }
}

Now, whenever we edit and save a file, Nodemon will conveniently restart the server for us and pick up the new changes.

If we also want to take advantage of the latest ES syntax, we should install Babel.

 

Adding Babel 7

Starting with Babel 7, the configuration changes quite a bit.

  1. Babel's packages are now scoped and Babel has renamed its NPM packages. This means babel-cli, for example, has been renamed to @babel/cli.
  2. No messing around with presets anymore. You can just use @babel/preset-env now and optionally define your requirements in the config file.
  3. babel-node has been moved from the CLI to its own package: @babel/node

Let’s go ahead and install all necessary packages:

yarn add @babel/core @babel/cli @babel/node --dev

and tell nodemon to use Babel to transpile JS.

./package.json

{
  "name": "graphql-tutorial",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT",
  "scripts": {
    "start": "nodemon --inspect=10111 --exec babel-node server.js"
  },
  "dependencies": {
    "express": "^4.16.4",
    "express-graphql": "^0.7.1",
    "graphql": "^14.0.2"
  },
  "devDependencies": {
    "nodemon": "^1.18.9"
  }
}

 

Before we can start using the new ES features we have to install a preset and tell Babel to use it by adding a .babelrc config file:

yarn add @babel/preset-env --dev

add

./.babelrc

{
  "presets": [
    [
      "@babel/preset-env",
      {
        "useBuiltIns": "entry"
      }
    ]
  ]
}

 

give it a try …

./server.js

import express from 'express';
import graphqlHTTP from 'express-graphql';
import schema from './src/schema';


// Logger middleware
var logger = function(req, res, next) {
  console.log("GOT REQUEST >", req.ip);
  next(); // Passing the request to the next handler in the stack.
}

var app = express();
app.use(logger);
app.use('/graphql', graphqlHTTP({
  schema: schema,
  graphiql: true,
}));
app.listen(4000);
console.log('Running a GraphQL API server at localhost:4000/graphql');

 

Using GraphQLSchema to construct the schema programmatically.

In most cases, defining a fixed schema when the application starts, using the schema language to declare the Query and Mutation types, is good enough. But sometimes we might need to define a dynamic schema, and we can achieve this by constructing it from JS objects.

Construct dynamic schema for ‘User Greeting’ example.

Let's go back to the 'Greeting user' example, because of its simplicity, and define a Query with a field named greetingUser, which accepts userName and bornMonth parameters (the first of type string, the second of type int) and returns a userType.

The userType will return greetingOne, which simply says "Hello [userName]", and greetingTwo, which lets the user know how many months are left until their next birthday. Both are of type string.

./src/server.js

var express = require('express');
var graphqlHTTP = require('express-graphql');
var { buildSchema } = require('graphql');
var dogs = require('./src/models/mock_data/dogs.js');
const graphql = require('graphql');

// Define the User type
var userType = new graphql.GraphQLObjectType({
  name: 'UserType',
  fields: {
    greetingOne: { type: graphql.GraphQLString },
    greetingTwo: { type: graphql.GraphQLString },
  }
});

// Define the Query type
var queryType = new graphql.GraphQLObjectType({
  name: 'Query',
  fields: {
    greetingUser: {
      type: userType,
      args: {
        userName: { type: graphql.GraphQLString },
        bornMonth: { type: graphql.GraphQLInt }
      },
      resolve: function (_, {userName, bornMonth}) {
        var date = new Date();
        // months remaining until the user's birth month (wrapping around the year end)
        var monthsLeft = bornMonth - (date.getMonth() + 1);
        monthsLeft = monthsLeft < 0 ? monthsLeft + 12 : monthsLeft;
        return {
          greetingOne: `Hello ${userName}`,
          greetingTwo: `Your birthday is coming in ${monthsLeft} month(s)`
        };
      }
    }
  }
});

var schema = new graphql.GraphQLSchema({query: queryType});


// Logger middleware
var logger = function(req, res, next) {
  console.log("GOT REQUEST >", req.ip);
  next(); // Passing the request to the next handler in the stack.
}

var app = express();
app.use(logger);
app.use('/graphql', graphqlHTTP({
  schema: schema,
  graphiql: true,
}));
app.listen(4000);
console.log('Running a GraphQL API server at localhost:4000/graphql');

what we just did:
– we defined the user type, which is pretty self-explanatory
– then we created the query type, whose greetingUser field has these parameters:
type, which is the return type, in this case userType
args, which declares the input parameter types
resolve, which is the resolver function (you can try the whole thing with the query below)
Transform Dogs catalog to use dynamic schema.

Creating the dog type

var dogType = new graphql.GraphQLObjectType({
  name: 'dogType',
  fields: {
    id: { type: graphql.GraphQLString },
    breed: { type: graphql.GraphQLString },
    displayImage: { type: graphql.GraphQLString },
  }
});

Creating the query type

// Define the Query type
var queryType = new graphql.GraphQLObjectType({
  name: 'Query',
  fields: {
    getDogByBreed: {
      type: dogType,
      args: {
        breed: { type: graphql.GraphQLString }
      },
      resolve: function (_, {breed}) {

        var result = dogs.find(function(dog){
          return breed == dog.breed;
        });
        return result;
      }
    }
  }
});

Creating a mutation type

...
    addDogBreed: {
      type: graphql.GraphQLString,
      args: {
        id: { type: graphql.GraphQLString },
        breed: { type: graphql.GraphQLString },
        displayImage: { type: graphql.GraphQLString }
      },
      resolve: function (_, {id, breed, displayImage}) {
        dogs.push({
          id: id,
          breed: breed,
          displayImage: displayImage
        });
        return "OK!";
      }
    } 
...

Adding the query schema

...
var schema = new graphql.GraphQLSchema({query: queryType});

...
app.use('/graphql', graphqlHTTP({
  schema: schema,
  graphiql: true,
}));
...

Putting it all together

var express = require('express');
var graphqlHTTP = require('express-graphql');
var { buildSchema } = require('graphql');
var dogs = require('./src/models/mock_data/dogs.js');
const graphql = require('graphql');

// Define the dogs type
var dogType = new graphql.GraphQLObjectType({
  name: 'dogType',
  fields: {
    id: { type: graphql.GraphQLString },
    breed: { type: graphql.GraphQLString },
    displayImage: { type: graphql.GraphQLString },
  }
});

// Define the Query type
var queryType = new graphql.GraphQLObjectType({
  name: 'Query',
  fields: {
    getDogByBreed: {
      type: dogType,
      args: {
        breed: { type: graphql.GraphQLString }
      },
      resolve: function (_, {breed}) {

        var result = dogs.find(function(dog){
          return breed == dog.breed;
        });
        return result;
      }
    },
    addDogBreed: {
      type: graphql.GraphQLString,
      args: {
        id: { type: graphql.GraphQLString },
        breed: { type: graphql.GraphQLString },
        displayImage: { type: graphql.GraphQLString }
      },
      resolve: function (_, {id, breed, displayImage}) {
        dogs.push({
          id: id,
          breed: breed,
          displayImage: displayImage
        });
        return "OK!";
      }
    }    
  }
});


var schema = new graphql.GraphQLSchema({query: queryType});


// Logger middleware
var logger = function(req, res, next) {
  console.log("GOT REQUEST >", req.ip);
  next(); // Passing the request to the next handler in the stack.
}

var app = express();
app.use(logger);
app.use('/graphql', graphqlHTTP({
  schema: schema,
  graphiql: true,
}));
app.listen(4000);
console.log('Running a GraphQL API server at localhost:4000/graphql');

Adding Middleware to log requested IPs

Middleware is a good way to pre-process a request before handing it over to GraphQL.
We could add authentication middleware that checks an auth token in the request headers, or middleware that logs the request IPs. Let's do the second one; a rough sketch of the first option follows for comparison.
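The header name and token check below are hypothetical, and this auth variant is not used in the rest of the tutorial:

// hypothetical auth middleware - rejects requests without the expected token
var auth = function(req, res, next) {
  if (req.headers['x-auth-token'] !== 'my-secret-token') {
    return res.status(401).send('Unauthorized');
  }
  next(); // token looks fine, pass the request to the next handler
}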

Adding middleware is done the same way as in any Express app. Add the logger lines in the listing below, and let's see what they do.

var express = require('express');
var graphqlHTTP = require('express-graphql');
var { buildSchema } = require('graphql');
var dogs = require('./src/models/mock_data/dogs.js');
// Construct a schema, using GraphQL schema language
var schema = buildSchema(`
  type Mutation {
    setNewDog(id: String, breed: String, displayImage: String): String
  }  
  type dog {
    id: String,
    breed: String,
    displayImage: String
  }
  type dogsCatalog {
    getDogByBreed(breed: String): dog
  }
  type Query {
    queryDogs: dogsCatalog
  }  
`);
class dogsCatalog {
  
  getDogByBreed({breed}) {
    var result = dogs.find(function(dog){
      return breed == dog.breed ? dog : null;
    });
    return result;
  }
}
// The root provides a resolver function for each API endpoint
var root = {
  queryDogs: () => {
    return new dogsCatalog();
  },
  setNewDog: ({id, breed, displayImage}) => {
    var result = dogs.find(function(dog){
      return breed == dog.breed ? dog : null;
    });
    if(result != null) {
      return 'dog already exists!';
    }
    dogs.push({
      "breed": breed,
      "displayImage": displayImage,
      "id": id      
    });
    return 'dog added successfully !';
  }
};

// Logger middleware
var logger = function(req, res, next) {
  console.log("GOT REQUEST >", req.ip);
  next(); // Passing the request to the next handler in the stack.
}

var app = express();
app.use(logger);
app.use('/graphql', graphqlHTTP({
  schema: schema,
  rootValue: root,
  graphiql: true,
}));
app.listen(4001);
console.log('Running a GraphQL API server at localhost:4001/graphql');

 

what we just did:
– we created the logger middleware function to log every request IP
– inside the middleware we call next() to pass the request on to the next handler once we are done processing it
– we told Express to use our new middleware with app.use(logger);

And as simply as that we created our own logging middleware.

Mutations and input types

 

Let's scrap our user greeting and make a really useful schema that will serve a dog breed catalog. We are going to use it in another tutorial to create a React component that displays the catalog.

Set up mock data.

Create a file under ./src/models/mock_data that will serve as our data provider, and let's add a couple of dog breeds there.

./src/models/mock_data/dogs.js

module.exports = [
  {
    "breed": "affenpinscher",
    "displayImage": "https://images.dog.ceo/breeds/affenpinscher/n02110627_10225.jpg",
    "id": "Z1fdFgU"
  },
  {
    "breed": "african",
    "displayImage": "https://images.dog.ceo/breeds/african/n02116738_2515.jpg",
    "id": "Z1gPiBt"
  },
  {
    "breed": "airedale",
    "displayImage": "https://images.dog.ceo/breeds/airedale/n02096051_6335.jpg",
    "id": "ZNDtCU"
  },
  {
    "breed": "akita",
    "displayImage": "https://images.dog.ceo/breeds/akita/Japaneseakita.jpg",
    "id": "6HalQ"
  },
  {
    "breed": "appenzeller",
    "displayImage": "https://images.dog.ceo/breeds/appenzeller/n02107908_3311.jpg",
    "id": "Z1tlucN"
  },
  {
    "breed": "basenji",
    "displayImage": "https://images.dog.ceo/breeds/basenji/n02110806_3415.jpg",
    "id": "Zo1k5y"
  }  
];

Writing a query to fetch the mock data.

Let's first write a query to fetch a dog's details based on the breed.

Based on the JSON data above, the dog schema would look like this:

...
  type dog {
    id: String,
    breed: String,
    displayImage: String
  }
  type dogsCatalog {
    getDogByBreed(breed: String): dog
  }
  type Query {
    queryDogs: dogsCatalog
  } 
...

We created a dogsCatalog type with a getDogByBreed query that accepts the dog breed and returns the dog type declared at the beginning of the snippet above.

Then the resolver would look like this:

...
class dogsCatalog {
  
  getDogByBreed({breed}) {
    var result = dogs.find(function(dog){
      return breed == dog.breed ? dog : null;
    });
    return result;
  }
}
...

 

Write the mutation.

Mutations are used to modify data. In GraphQL they are similar to queries and are written in a similar way.

The basic idea.

...
  type Mutation {
    setNewDog(id: String, breed: String, displayImage: String): String
  }  
...

Here, we defined a mutation named setNewDog, which simply takes a few parameters (id, breed and displayImage) and returns a status message.

The resolver will look like this:

...
  setNewDog: ({id, breed, displayImage}) => {
    var result = dogs.find(function(dog){
      return breed == dog.breed ? dog : null;
    });
    if(result != null) {
      return 'dog already exists!';
    }

    dogs.push({
      "breed": breed,
      "displayImage": displayImage,
      "id": id      
    });
    return 'dog added successfully !';
  }
...

Let’s put them all together:

./src/server.js

var express = require('express');
var graphqlHTTP = require('express-graphql');
var { buildSchema } = require('graphql');
var dogs = require('./src/models/mock_data/dogs.js');

// Construct a schema, using GraphQL schema language

var schema = buildSchema(`


  type Mutation {
    setNewDog(id: String, breed: String, displayImage: String): String
  }  

  type dog {
    id: String,
    breed: String,
    displayImage: String
  }

  type dogsCatalog {
    getDogByBreed(breed: String): dog
  }

  type Query {
    queryDogs: dogsCatalog
  }  
`);

class dogsCatalog {
  
  getDogByBreed({breed}) {
    var result = dogs.find(function(dog){
      return breed == dog.breed ? dog : null;
    });
    return result;
  }
}

// The root provides a resolver function for each API endpoint
var root = {
  queryDogs: () => {
    return new dogsCatalog();
  },
  setNewDog: ({id, breed, displayImage}) => {
    var result = dogs.find(function(dog){
      return breed == dog.breed ? dog : null;
    });
    if(result != null) {
      return 'dog already exists!';
    }

    dogs.push({
      "breed": breed,
      "displayImage": displayImage,
      "id": id      
    });
    return 'dog added successfully !';
  }
};

var app = express();
app.use('/graphql', graphqlHTTP({
  schema: schema,
  rootValue: root,
  graphiql: true,
}));
app.listen(4000);
console.log('Running a GraphQL API server at localhost:4000/graphql');

Test Using GraphQL UI

Writing a query to fetch existing data.

Now navigate to the GraphQL UI at http://localhost:4000/graphql and add this in the query window (top left):

query queryLabel($breed: String!) {
  queryDogs {
    getDogByBreed(breed: $breed) {
      id
      displayImage,
      breed
    }
  }
}

and this in the variables window (bottom left):

{
  "breed": "african"
}

hit play and the result should be this:

{
  "data": {
    "queryDogs": {
      "getDogByBreed": {
        "id": "Z1gPiBt",
        "displayImage": "https://images.dog.ceo/breeds/african/n02116738_2515.jpg",
        "breed": "african"
      }
    }
  }
}

Now change the breed in the variables window (bottom left) to 'labrador' and hit play again; the result will be null, since there is no record for this breed yet.

Write a mutation to add a new record to the data structure.

Open a new GraphQL UI tab and add the following mutation in the query window:

mutation setNewDogLabel($id: String, $breed: String, $displayImage: String) {
  setNewDog(id: $id, breed: $breed, displayImage: $displayImage)
}

and this in the variables window:

{
  "id": "Z2iEkxg",
  "breed": "labrador",
  "displayImage": "https://images.dog.ceo/breeds/labrador/n02099712_2120.jpg"
}

Hit play and if everything went well you will see the success message.

{
  "data": {
    "setNewDog": "dog added successfuly !"
  }
}

(Out of curiosity you might want to hit play one more time to see what message you will get)

Now, navigate back to the previous GraphQL UI tab, hit play again, and you will see the newly created record.

Using an input type.

So far the setNewDog mutation takes its arguments one by one. GraphQL also lets us group them into an input type (DogInput below) and pass them to the mutation as a single input object. Here is the same server rewritten that way:
var express = require('express');
var graphqlHTTP = require('express-graphql');
var { buildSchema } = require('graphql');
var dogs = require('./src/models/mock_data/dogs.js');

// Construct a schema, using GraphQL schema language

var schema = buildSchema(`

  input DogInput {
    id: String,
    breed: String,
    displayImage: String
  }

  type Mutation {
    setNewDog(input: DogInput): String
  }  

  type dog {
    id: String,
    breed: String,
    displayImage: String
  }

  type dogsCatalog {
    getDogByBreed(breed: String): dog
  }

  type Query {
    queryDogs: dogsCatalog
  }  
`);

class dogsCatalog {
  
  getDogByBreed({breed}) {
    var result = dogs.find(function(dog){
      return breed == dog.breed ? dog : null;
    });
    return result;
  }
}

// The root provides a resolver function for each API endpoint
var root = {
  queryDogs: () => {
    return new dogsCatalog();
  },
  setNewDog: ({input}) => {
    var result = dogs.find(function(dog){
      return input.breed == dog.breed ? dog : null;
    });
    if(result != null) {
      return 'dog already exists!';
    }

    dogs.push({
      "breed": input.breed,
      "displayImage": input.displayImage,
      "id": input.id      
    });
    return 'dog added successfully !';
  }
};

var app = express();
app.use('/graphql', graphqlHTTP({
  schema: schema,
  rootValue: root,
  graphiql: true,
}));
app.listen(4000);
console.log('Running a GraphQL API server at localhost:4000/graphql');
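With the input type in place, calling the mutation from the GraphQL UI changes slightly: put something like this in the query window

mutation setNewDogLabel($input: DogInput) {
  setNewDog(input: $input)
}

and this in the variables window:

{
  "input": {
    "id": "Z2iEkxg",
    "breed": "labrador",
    "displayImage": "https://images.dog.ceo/breeds/labrador/n02099712_2120.jpg"
  }
}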