15th August 2016 11:18 pm
The Raspberry Pi is a great device for running simple web apps at home on a permanent basis, and you can pick up a small touchscreen for it quite cheaply. This makes it easy to build and host a small personal dashboard that pulls important data from various APIs or RSS feeds and displays it. You’ll often see dashboards like this on Raspberry Pi forums and subreddits. As I’m currently between jobs and have some time to spare before my new job starts, I decided to build my own. React.js is an obvious fit for this, as it allows you to break your user interface up into multiple independent components, keep the functionality close to the UI, and reuse widgets simply by passing different props to each instance.
In this tutorial I’ll show you how to start building a simple personal dashboard using React and Webpack. You can then install Nginx on your Raspberry Pi and host it from there. In the process, you’ll pick up a bit of knowledge about Webpack and ECMAScript 2015 (using Babel). Our initial implementation will have only two widgets, a clock and an RSS feed, but those cover enough of the basics that you should then be able to build any other widgets you have in mind.
Installing our dependencies
First, let’s create our package.json:
$ npm init -y
Then install the dependencies:
$ npm install --save-dev babel-cli babel-register babel-core babel-eslint babel-loader babel-preset-es2015 babel-preset-react chai css-loader eslint eslint-loader eslint-plugin-react file-loader istanbul@^1.0.0-alpha.2 jquery jsdom mocha moment node-sass react react-addons-pure-render-mixin react-addons-test-utils react-dom react-hot-loader request sass-loader style-loader url-loader webpack webpack-dev-server
Note that we need to install a specific version of Istanbul to get code coverage.
Next, we create our Webpack config. Save this as webpack.config.js:
| var webpack = require('webpack'); |
| module.exports = { |
| entry: [ |
| 'webpack/hot/only-dev-server', |
| "./js/app.js" |
| ], |
| debug: true, |
| devtool: 'source-map', |
| output: { |
| path: __dirname + '/static', |
| filename: "bundle.js" |
| }, |
| module: { |
| preLoaders: [ |
| { |
| test: /(\.js$|\.jsx$)/, |
| exclude: /node_modules/, |
| loader: "eslint-loader" |
| } |
| ], |
| loaders: [ |
| { test: /\.jsx?$/, loaders: ['react-hot', 'babel'], exclude: /node_modules/ }, |
| { test: /\.js$/, exclude: /node_modules/, loader: 'babel-loader'}, |
| { test: /\.woff2?$/, loader: "url-loader?limit=25000" }, |
| { test: /\.(eot|svg|ttf)?$/, loader: "file-loader" }, |
| { test: /\.scss$/, loader: "style!css!sass" } |
| ] |
| }, |
| eslint: { |
| configFile: '.eslintrc.yml' |
| }, |
| plugins: [ |
| new webpack.HotModuleReplacementPlugin(), |
| new webpack.NoErrorsPlugin() |
| ] |
| }; |
Note the various loaders we’re using. We use ESLint to lint our Javascript files for code quality, and the build will fail if they do not match the required standards. We’re also using loaders for CSS, Sass, Babel (so we can use ES2015 for our Javascript) and fonts. Also, note the hot module replacement plugin - this allows us to reload the application automatically. If you haven’t used Webpack before, this config should be sufficient to get you started, but I recommend reading the documentation.
We also need to configure ESLint to our liking. Here is the configuration we will be using, which should be saved as .eslintrc.yml:
| rules: |
| no-debugger: |
| - 0 |
| no-console: |
| - 0 |
| no-unused-vars: |
| - 0 |
| indent: |
| - 2 |
| - 2 |
| quotes: |
| - 2 |
| - single |
| linebreak-style: |
| - 2 |
| - unix |
| semi: |
| - 2 |
| - always |
| env: |
| es6: true |
| browser: true |
| node: true |
| extends: 'eslint:recommended' |
| parserOptions: |
| sourceType: module |
| ecmaFeatures: |
| jsx: true |
| experimentalObjectRestSpread: true |
| modules: true |
| plugins: |
| - react |
We also need a base HTML file. Save this as index.html:
| <!doctype html> |
| <html lang="en"> |
| <head> |
| <meta charset="utf-8"> |
| <title>Personal Dashboard</title> |
| </head> |
| <body> |
| <div id="view"></section> |
| <script src="bundle.js"></script> |
| </body> |
| </html> |
We also need to set the commands for building and testing our app in package.json:
| "scripts": { |
| "test": "istanbul cover _mocha -- --compilers js:babel-core/register --require ./test/setup.js 'test/**/*.@(js|jsx)'", |
| "test:watch": "npm run test -- --watch", |
| "start": "webpack-dev-server --progress --colors", |
| "build": "webpack --progress --colors" |
| }, |
| "babel": { |
| "presets": [ |
| "es2015", |
| "react" |
| ] |
| }, |
The npm test command will call Mocha to run the tests, but will also use Istanbul to generate test coverage. For the sake of brevity, our tests won’t be terribly comprehensive. The npm start command will run a development server, while npm run build will build our application.
We also need to create the test/ folder and the test/setup.js file:
| import jsdom from 'jsdom'; |
| import chai from 'chai'; |
| |
| const doc = jsdom.jsdom('<!doctype html><html><body></body></html>'); |
| const win = doc.defaultView; |
| |
| global.document = doc; |
| global.window = win; |
| |
| Object.keys(window).forEach((key) => { |
| if (!(key in global)) { |
| global[key] = window[key]; |
| } |
| }); |
This sets up Chai and creates a dummy DOM for our tests. We also need to create the folder js/ and the file js/app.js. You can leave that file empty for now.
If you now run npm start and navigate to http://localhost:8080/webpack-dev-server/, you can see the current state of the application.
Our dashboard component
Our first React component will be a wrapper for all the other ones. Each of the rest of the components will be a self-contained widget that will populate itself without the need for a centralised data store like Redux. I will mention that Redux is a very useful library, and for larger React applications it makes a lot of sense to use it, but here we’re better off having each widget manage its own data internally, rather than have it be passed down from a single data store.
Save the following as test/components/dashboard.js:
| import TestUtils from 'react-addons-test-utils'; |
| import React from 'react'; |
| import {findDOMNode} from 'react-dom'; |
| import Dashboard from '../../js/components/dashboard'; |
| import {expect} from 'chai'; |
| |
| const {renderIntoDocument, scryRenderedDOMComponentsWithClass, Simulate} = TestUtils; |
| |
| describe('Dashboard', () => { |
| it('renders the dashboard', () => { |
| const component = renderIntoDocument( |
| <Dashboard title="My Dashboard" /> |
| ); |
| const title = findDOMNode(component.refs.title); |
| expect(title).to.be.ok; |
| expect(title.textContent).to.contain('My Dashboard'); |
| }); |
| }); |
This tests that we can set the title of our dashboard component. Let’s run our tests:
| $ npm test |
| |
| > personal-dashboard@1.0.0 test /home/matthew/Projects/personal-dashboard |
| > istanbul cover _mocha -- --compilers js:babel-core/register --require ./test/setup.js 'test/**/*.@(js|jsx)' |
| |
| No coverage information was collected, exit without writing coverage information |
| module.js:327 |
| throw err; |
| ^ |
| |
| Error: Cannot find module '../../js/components/dashboard' |
| at Function.Module._resolveFilename (module.js:325:15) |
| at Function.Module._load (module.js:276:25) |
| at Module.require (module.js:353:17) |
| at require (internal/module.js:12:17) |
| at Object.<anonymous> (dashboard.js:4:1) |
| at Module._compile (module.js:409:26) |
| at loader (/home/matthew/Projects/personal-dashboard/node_modules/babel-register/lib/node.js:148:5) |
| at Object.require.extensions.(anonymous function) [as .js] (/home/matthew/Projects/personal-dashboard/node_modules/babel-register/lib/node.js:158:7) |
| at Module.load (module.js:343:32) |
| at Function.Module._load (module.js:300:12) |
| at Module.require (module.js:353:17) |
| at require (internal/module.js:12:17) |
| at /home/matthew/Projects/personal-dashboard/node_modules/mocha/lib/mocha.js:220:27 |
| at Array.forEach (native) |
| at Mocha.loadFiles (/home/matthew/Projects/personal-dashboard/node_modules/mocha/lib/mocha.js:217:14) |
| at Mocha.run (/home/matthew/Projects/personal-dashboard/node_modules/mocha/lib/mocha.js:485:10) |
| at Object.<anonymous> (/home/matthew/Projects/personal-dashboard/node_modules/mocha/bin/_mocha:403:18) |
| at Module._compile (module.js:409:26) |
| at Object.Module._extensions..js (module.js:416:10) |
| at Object.Module._extensions.(anonymous function) (/home/matthew/Projects/personal-dashboard/node_modules/istanbul/lib/hook.js:109:37) |
| at Module.load (module.js:343:32) |
| at Function.Module._load (module.js:300:12) |
| at Function.Module.runMain (module.js:441:10) |
| at runFn (/home/matthew/Projects/personal-dashboard/node_modules/istanbul/lib/command/common/run-with-cover.js:122:16) |
| at /home/matthew/Projects/personal-dashboard/node_modules/istanbul/lib/command/common/run-with-cover.js:251:17 |
| at /home/matthew/Projects/personal-dashboard/node_modules/istanbul/lib/util/file-matcher.js:68:16 |
| at /home/matthew/Projects/personal-dashboard/node_modules/async/lib/async.js:52:16 |
| at /home/matthew/Projects/personal-dashboard/node_modules/async/lib/async.js:361:13 |
| at /home/matthew/Projects/personal-dashboard/node_modules/async/lib/async.js:52:16 |
| at done (/home/matthew/Projects/personal-dashboard/node_modules/async/lib/async.js:246:17) |
| at /home/matthew/Projects/personal-dashboard/node_modules/async/lib/async.js:44:16 |
| at /home/matthew/Projects/personal-dashboard/node_modules/async/lib/async.js:358:17 |
| at LOOP (fs.js:1530:14) |
| at nextTickCallbackWith0Args (node.js:420:9) |
| at process._tickCallback (node.js:349:13) |
| npm ERR! Test failed. See above for more details. |
Our dashboard file doesn’t exist. So let’s create it:
| $ mkdir js/components |
| $ touch js/components/dashboard.js |
And run our test again:
| $ npm test |
| |
| > personal-dashboard@1.0.0 test /home/matthew/Projects/personal-dashboard |
| > istanbul cover _mocha -- --compilers js:babel-core/register --require ./test/setup.js 'test/**/*.@(js|jsx)' |
| |
| |
| |
| Dashboard |
| Warning: React.createElement: type should not be null, undefined, boolean, or number. It should be a string (for DOM elements) or a ReactClass (for composite components). |
| 1) renders the dashboard |
| |
| |
| 0 passing (31ms) |
| 1 failing |
| |
| 1) Dashboard renders the dashboard: |
| Invariant Violation: Element type is invalid: expected a string (for built-in components) or a class/function (for composite components) but got: object. |
| at invariant (node_modules/fbjs/lib/invariant.js:38:15) |
| at [object Object].instantiateReactComponent [as _instantiateReactComponent] (node_modules/react/lib/instantiateReactComponent.js:86:134) |
| at [object Object].ReactCompositeComponentMixin.performInitialMount (node_modules/react/lib/ReactCompositeComponent.js:388:22) |
| at [object Object].ReactCompositeComponentMixin.mountComponent (node_modules/react/lib/ReactCompositeComponent.js:262:21) |
| at Object.ReactReconciler.mountComponent (node_modules/react/lib/ReactReconciler.js:47:35) |
| at mountComponentIntoNode (node_modules/react/lib/ReactMount.js:105:32) |
| at ReactReconcileTransaction.Mixin.perform (node_modules/react/lib/Transaction.js:138:20) |
| at batchedMountComponentIntoNode (node_modules/react/lib/ReactMount.js:126:15) |
| at ReactDefaultBatchingStrategyTransaction.Mixin.perform (node_modules/react/lib/Transaction.js:138:20) |
| at Object.ReactDefaultBatchingStrategy.batchedUpdates (node_modules/react/lib/ReactDefaultBatchingStrategy.js:63:19) |
| at Object.batchedUpdates (node_modules/react/lib/ReactUpdates.js:98:20) |
| at Object.ReactMount._renderNewRootComponent (node_modules/react/lib/ReactMount.js:285:18) |
| at Object.ReactMount._renderSubtreeIntoContainer (node_modules/react/lib/ReactMount.js:371:32) |
| at Object.ReactMount.render (node_modules/react/lib/ReactMount.js:392:23) |
| at ReactTestUtils.renderIntoDocument (node_modules/react/lib/ReactTestUtils.js:85:21) |
| at Context.<anonymous> (dashboard.js:11:23) |
| |
| |
| |
| No coverage information was collected, exit without writing coverage information |
| npm ERR! Test failed. See above for more details. |
Now that we have a failing test, we can create our component. Save this as js/components/dashboard.js:
| import React from 'react'; |
| |
| export default React.createClass({ |
| render() { |
| return ( |
| <div className="dashboard"> |
| <h1 ref="title">{this.props.title}</h1> |
| <div className="wrapper"> |
| </div> |
| </div> |
| ); |
| } |
| }); |
And let’s run our tests again:
| $ npm test |
| |
| > personal-dashboard@1.0.0 test /home/matthew/Projects/personal-dashboard |
| > istanbul cover _mocha -- --compilers js:babel-core/register --require ./test/setup.js 'test/**/*.@(js|jsx)' |
| |
| |
| |
| Dashboard |
| ✓ renders the dashboard |
| |
| |
| 1 passing (50ms) |
| |
| No coverage information was collected, exit without writing coverage information |
Our first component is in place. However, it isn’t getting loaded. We also need to start thinking about styling. Create the file scss/style.scss, but leave it blank for now. Then save this in js/app.js:
| import React from 'react'; |
| import ReactDOM from 'react-dom'; |
| import Dashboard from './components/dashboard'; |
| import styles from '../scss/style.scss'; |
| |
| ReactDOM.render( |
| <Dashboard title="My Dashboard" />, |
| document.getElementById('view') |
| ); |
Note that we’re importing CSS or Sass files in the same way as Javascript files. This is a Webpack feature, and while it takes a bit of getting used to, it has its advantages - if each component imports only the styles relating to it, you can be sure there are no orphaned CSS files. Here we only have one stylesheet anyway, so it’s a non-issue.
If you now run npm start, our dashboard gets loaded and the title is displayed. With our dashboard in place, we can now implement our first widget.
Our first widget will be a simple clock. This demonstrates changing the state of the widget on an interval. First let’s write a test - save this as test/components/clockwidget.js:
| import TestUtils from 'react-addons-test-utils'; |
| import React from 'react'; |
| import {findDOMNode} from 'react-dom'; |
| import ClockWidget from '../../js/components/clockwidget'; |
| import {expect} from 'chai'; |
| |
| const {renderIntoDocument, scryRenderedDOMComponentsWithClass, Simulate} = TestUtils; |
| |
| describe('Clock Widget', () => { |
| it('renders the clock widget', () => { |
| const currentTime = 1465160300530; |
| const component = renderIntoDocument( |
| <ClockWidget time={currentTime} /> |
| ); |
| const time = findDOMNode(component.refs.time); |
| expect(time).to.be.ok; |
| expect(time.textContent).to.contain('Sunday'); |
| }); |
| }); |
And create an empty file at js/components/clockwidget.js. Then we run our tests again:
| $ npm test |
| |
| > personal-dashboard@1.0.0 test /home/matthew/Projects/personal-dashboard |
| > istanbul cover _mocha -- --compilers js:babel-core/register --require ./test/setup.js 'test/**/*.@(js|jsx)' |
| |
| |
| |
| Clock Widget |
| Warning: React.createElement: type should not be null, undefined, boolean, or number. It should be a string (for DOM elements) or a ReactClass (for composite components). |
| 1) renders the clock widget |
| |
| Dashboard |
| ✓ renders the dashboard |
| |
| |
| 1 passing (46ms) |
| 1 failing |
| |
| 1) Clock Widget renders the clock widget: |
| Invariant Violation: Element type is invalid: expected a string (for built-in components) or a class/function (for composite components) but got: object. |
| at invariant (node_modules/fbjs/lib/invariant.js:38:15) |
| at [object Object].instantiateReactComponent [as _instantiateReactComponent] (node_modules/react/lib/instantiateReactComponent.js:86:134) |
| at [object Object].ReactCompositeComponentMixin.performInitialMount (node_modules/react/lib/ReactCompositeComponent.js:388:22) |
| at [object Object].ReactCompositeComponentMixin.mountComponent (node_modules/react/lib/ReactCompositeComponent.js:262:21) |
| at Object.ReactReconciler.mountComponent (node_modules/react/lib/ReactReconciler.js:47:35) |
| at mountComponentIntoNode (node_modules/react/lib/ReactMount.js:105:32) |
| at ReactReconcileTransaction.Mixin.perform (node_modules/react/lib/Transaction.js:138:20) |
| at batchedMountComponentIntoNode (node_modules/react/lib/ReactMount.js:126:15) |
| at ReactDefaultBatchingStrategyTransaction.Mixin.perform (node_modules/react/lib/Transaction.js:138:20) |
| at Object.ReactDefaultBatchingStrategy.batchedUpdates (node_modules/react/lib/ReactDefaultBatchingStrategy.js:63:19) |
| at Object.batchedUpdates (node_modules/react/lib/ReactUpdates.js:98:20) |
| at Object.ReactMount._renderNewRootComponent (node_modules/react/lib/ReactMount.js:285:18) |
| at Object.ReactMount._renderSubtreeIntoContainer (node_modules/react/lib/ReactMount.js:371:32) |
| at Object.ReactMount.render (node_modules/react/lib/ReactMount.js:392:23) |
| at ReactTestUtils.renderIntoDocument (node_modules/react/lib/ReactTestUtils.js:85:21) |
| at Context.<anonymous> (clockwidget.js:12:23) |
| |
| |
| |
| No coverage information was collected, exit without writing coverage information |
| npm ERR! Test failed. See above for more details. |
With a failing test in place, we can create our component:
| import React from 'react'; |
| import moment from 'moment'; |
| |
| export default React.createClass({ |
| getInitialState() { |
| return { |
| time: this.props.time || moment() |
| }; |
| }, |
| render() { |
| const time = moment(this.state.time).format('dddd, Do MMMM YYYY, h:mm:ss a'); |
| return ( |
| <div className="clockwidget widget"> |
| <div className="widget-content"> |
| <h2 ref="time">{time}</h2> |
| </div> |
| </div> |
| ); |
| } |
| }); |
Note that the component accepts a property of time. The getInitialState() method then converts this.props.time into this.state.time so that it can be displayed on render. Note we also set a default of the current time using Moment.js.
We also need to update the dashboard component to load this new component:
| import React from 'react'; |
| import ClockWidget from './clockwidget'; |
| |
| export default React.createClass({ |
| render() { |
| return ( |
| <div className="dashboard"> |
| <h1 ref="title">{this.props.title}</h1> |
| <div className="wrapper"> |
| <ClockWidget /> |
| </div> |
| </div> |
| ); |
| } |
| }); |
Now, if you try running npm start and viewing the dashboard in the browser, you will see that it displays the current time and date, but it’s not being updated. You can force the page to reload every now and then, but we can do better than that. We can set an interval on which the time will refresh. As the smallest unit we show is seconds, this interval should be 1 second.
Amend the clock component as follows:
| import React from 'react'; |
| import moment from 'moment'; |
| |
| export default React.createClass({ |
| getInitialState() { |
| return { |
| time: this.props.time || moment() |
| }; |
| }, |
| tick() { |
| this.setState({ |
| time: moment() |
| }); |
| }, |
| componentDidMount() { |
| this.interval = setInterval(this.tick, 1000); |
| }, |
| componentWillUnmount() { |
| clearInterval(this.interval); |
| }, |
| render() { |
| const time = moment(this.state.time).format('dddd, Do MMMM YYYY, h:mm:ss a'); |
| return ( |
| <div className="clockwidget widget"> |
| <div className="widget-content"> |
| <h2 ref="time">{time}</h2> |
| </div> |
| </div> |
| ); |
| } |
| }); |
When our component has mounted, we set an interval of 1,000 milliseconds, and each time it elapses we call the tick() method. This method sets the state to the current time, and as a result the user interface is automatically re-rendered. On unmount, we clear the interval.
In this case we’re just calling a single function on a set interval. In principle, the same approach can be used to populate components in other ways, such as by making an AJAX request.
Our next widget will be a simple RSS feed reader. We’ll fetch the content with jQuery and render it using React. We’ll also reload it regularly. First, let’s create our test:
| import TestUtils from 'react-addons-test-utils'; |
| import React from 'react'; |
| import {findDOMNode} from 'react-dom'; |
| import FeedWidget from '../../js/components/feedwidget'; |
| import {expect} from 'chai'; |
| |
| const {renderIntoDocument, scryRenderedDOMComponentsWithClass, Simulate} = TestUtils; |
| |
| describe('Feed Widget', () => { |
| it('renders the Feed widget', () => { |
| const url = "http://feeds.bbci.co.uk/news/rss.xml?edition=uk" |
| const component = renderIntoDocument( |
| <FeedWidget feed={url} size={5} delay={60} /> |
| ); |
| const feed = findDOMNode(component.refs.feed); |
| expect(feed).to.be.ok; |
| expect(feed.textContent).to.contain(url); |
| }); |
| }); |
Our feed widget will accept an external URL as an argument, and will then poll this URL regularly to populate the feed. It also allows us to specify the size attribute, which denotes the number of feed items, and the delay attribute, which denotes the number of seconds it should wait before fetching the data again.
We also need to amend the dashboard component to include this widget:
| import React from 'react'; |
| import ClockWidget from './clockwidget'; |
| import FeedWidget from './feedwidget'; |
| |
| export default React.createClass({ |
| render() { |
| return ( |
| <div className="dashboard"> |
| <h1 ref="title">{this.props.title}</h1> |
| <div className="wrapper"> |
| <ClockWidget /> |
| <FeedWidget feed="http://feeds.bbci.co.uk/news/rss.xml?edition=uk" size="5" delay="60" /> |
| </div> |
| </div> |
| ); |
| } |
| }); |
If we then create js/components/feedwidget.js and run npm test:
| $ npm test |
| |
| > personal-dashboard@1.0.0 test /home/matthew/Projects/personal-dashboard |
| > istanbul cover _mocha -- --compilers js:babel-core/register --require ./test/setup.js 'test/**/*.@(js|jsx)' |
| |
| |
| |
| Clock Widget |
| ✓ renders the clock widget (92ms) |
| |
| Dashboard |
| Warning: React.createElement: type should not be null, undefined, boolean, or number. It should be a string (for DOM elements) or a ReactClass (for composite components). Check the render method of `dashboard`. |
| 1) renders the dashboard |
| |
| Feed Widget |
| Warning: React.createElement: type should not be null, undefined, boolean, or number. It should be a string (for DOM elements) or a ReactClass (for composite components). |
| 2) renders the Feed widget |
| |
| |
| 1 passing (286ms) |
| 2 failing |
| |
| 1) Dashboard renders the dashboard: |
| Invariant Violation: Element type is invalid: expected a string (for built-in components) or a class/function (for composite components) but got: object. Check the render method of `dashboard`. |
| at invariant (node_modules/fbjs/lib/invariant.js:38:15) |
| at instantiateReactComponent (node_modules/react/lib/instantiateReactComponent.js:86:134) |
| at instantiateChild (node_modules/react/lib/ReactChildReconciler.js:43:28) |
| at node_modules/react/lib/ReactChildReconciler.js:70:16 |
| at traverseAllChildrenImpl (node_modules/react/lib/traverseAllChildren.js:69:5) |
| at traverseAllChildrenImpl (node_modules/react/lib/traverseAllChildren.js:85:23) |
| at traverseAllChildren (node_modules/react/lib/traverseAllChildren.js:164:10) |
| at Object.ReactChildReconciler.instantiateChildren (node_modules/react/lib/ReactChildReconciler.js:69:7) |
| at ReactDOMComponent.ReactMultiChild.Mixin._reconcilerInstantiateChildren (node_modules/react/lib/ReactMultiChild.js:194:41) |
| at ReactDOMComponent.ReactMultiChild.Mixin.mountChildren (node_modules/react/lib/ReactMultiChild.js:231:27) |
| at ReactDOMComponent.Mixin._createInitialChildren (node_modules/react/lib/ReactDOMComponent.js:715:32) |
| at ReactDOMComponent.Mixin.mountComponent (node_modules/react/lib/ReactDOMComponent.js:531:12) |
| at Object.ReactReconciler.mountComponent (node_modules/react/lib/ReactReconciler.js:47:35) |
| at ReactDOMComponent.ReactMultiChild.Mixin.mountChildren (node_modules/react/lib/ReactMultiChild.js:242:44) |
| at ReactDOMComponent.Mixin._createInitialChildren (node_modules/react/lib/ReactDOMComponent.js:715:32) |
| at ReactDOMComponent.Mixin.mountComponent (node_modules/react/lib/ReactDOMComponent.js:531:12) |
| at Object.ReactReconciler.mountComponent (node_modules/react/lib/ReactReconciler.js:47:35) |
| at [object Object].ReactCompositeComponentMixin.performInitialMount (node_modules/react/lib/ReactCompositeComponent.js:397:34) |
| at [object Object].ReactCompositeComponentMixin.mountComponent (node_modules/react/lib/ReactCompositeComponent.js:262:21) |
| at Object.ReactReconciler.mountComponent (node_modules/react/lib/ReactReconciler.js:47:35) |
| at [object Object].ReactCompositeComponentMixin.performInitialMount (node_modules/react/lib/ReactCompositeComponent.js:397:34) |
| at [object Object].ReactCompositeComponentMixin.mountComponent (node_modules/react/lib/ReactCompositeComponent.js:262:21) |
| at Object.ReactReconciler.mountComponent (node_modules/react/lib/ReactReconciler.js:47:35) |
| at mountComponentIntoNode (node_modules/react/lib/ReactMount.js:105:32) |
| at ReactReconcileTransaction.Mixin.perform (node_modules/react/lib/Transaction.js:138:20) |
| at batchedMountComponentIntoNode (node_modules/react/lib/ReactMount.js:126:15) |
| at ReactDefaultBatchingStrategyTransaction.Mixin.perform (node_modules/react/lib/Transaction.js:138:20) |
| at Object.ReactDefaultBatchingStrategy.batchedUpdates (node_modules/react/lib/ReactDefaultBatchingStrategy.js:63:19) |
| at Object.batchedUpdates (node_modules/react/lib/ReactUpdates.js:98:20) |
| at Object.ReactMount._renderNewRootComponent (node_modules/react/lib/ReactMount.js:285:18) |
| at Object.ReactMount._renderSubtreeIntoContainer (node_modules/react/lib/ReactMount.js:371:32) |
| at Object.ReactMount.render (node_modules/react/lib/ReactMount.js:392:23) |
| at ReactTestUtils.renderIntoDocument (node_modules/react/lib/ReactTestUtils.js:85:21) |
| at Context.<anonymous> (dashboard.js:11:23) |
| |
| 2) Feed Widget renders the Feed widget: |
| Invariant Violation: Element type is invalid: expected a string (for built-in components) or a class/function (for composite components) but got: object. |
| at invariant (node_modules/fbjs/lib/invariant.js:38:15) |
| at [object Object].instantiateReactComponent [as _instantiateReactComponent] (node_modules/react/lib/instantiateReactComponent.js:86:134) |
| at [object Object].ReactCompositeComponentMixin.performInitialMount (node_modules/react/lib/ReactCompositeComponent.js:388:22) |
| at [object Object].ReactCompositeComponentMixin.mountComponent (node_modules/react/lib/ReactCompositeComponent.js:262:21) |
| at Object.ReactReconciler.mountComponent (node_modules/react/lib/ReactReconciler.js:47:35) |
| at mountComponentIntoNode (node_modules/react/lib/ReactMount.js:105:32) |
| at ReactReconcileTransaction.Mixin.perform (node_modules/react/lib/Transaction.js:138:20) |
| at batchedMountComponentIntoNode (node_modules/react/lib/ReactMount.js:126:15) |
| at ReactDefaultBatchingStrategyTransaction.Mixin.perform (node_modules/react/lib/Transaction.js:138:20) |
| at Object.ReactDefaultBatchingStrategy.batchedUpdates (node_modules/react/lib/ReactDefaultBatchingStrategy.js:63:19) |
| at Object.batchedUpdates (node_modules/react/lib/ReactUpdates.js:98:20) |
| at Object.ReactMount._renderNewRootComponent (node_modules/react/lib/ReactMount.js:285:18) |
| at Object.ReactMount._renderSubtreeIntoContainer (node_modules/react/lib/ReactMount.js:371:32) |
| at Object.ReactMount.render (node_modules/react/lib/ReactMount.js:392:23) |
| at ReactTestUtils.renderIntoDocument (node_modules/react/lib/ReactTestUtils.js:85:21) |
| at Context.<anonymous> (feedwidget.js:12:23) |
| |
| |
| |
| |
| =============================== Coverage summary =============================== |
| Statements : 83.33% ( 10/12 ) |
| Branches : 50% ( 1/2 ) |
| Functions : 66.67% ( 4/6 ) |
| Lines : 83.33% ( 10/12 ) |
| ================================================================================ |
| npm ERR! Test failed. See above for more details. |
Our test fails, so we can start work on the widget proper. Here it is:
| import React from 'react'; |
| import jQuery from 'jquery'; |
| window.jQuery = jQuery; |
| |
| const FeedItem = React.createClass({ |
| render() { |
| return ( |
| <a href={this.props.link} target="_blank"> |
| <li className="feeditem">{this.props.title}</li> |
| </a> |
| ); |
| } |
| }); |
| |
| export default React.createClass({ |
| getInitialState() { |
| return { |
| feed: [], |
| size: this.props.size || 5 |
| }; |
| }, |
| componentDidMount() { |
| this.getFeed(); |
| this.interval = setInterval(this.getFeed, (this.props.delay * 1000)); |
| }, |
| componentWillUnmount() { |
| clearInterval(this.interval); |
| }, |
| getFeed() { |
| let that = this; |
| jQuery.ajax({ |
| url: this.props.feed, |
| success: function (response) { |
| let xml = jQuery(response); |
| let feed = []; |
| xml.find('item').each(function () { |
| let item = {}; |
| item.title = jQuery(this).find('title').text(); |
| item.link = jQuery(this).find('guid').text(); |
| feed.push(item); |
| }); |
| that.setState({ |
| feed: feed.slice(0,that.state.size) |
| }); |
| } |
| }); |
| }, |
| render() { |
| let feedItems = this.state.feed.map(function (item, index) { |
| return ( |
| <FeedItem title={item.title} link={item.link} key={item.link}></FeedItem> |
| ); |
| }); |
| return ( |
| <div className="feedwidget widget"> |
| <div className="widget-content"> |
| <h2 ref="feed"> Fetched from {this.props.feed}</h2> |
| <ul> |
| {feedItems} |
| </ul> |
| </div> |
| </div> |
| ); |
| } |
| }); |
This is by far the most complex component, so a little explanation is called for. We include jQuery as a dependency at the top of the file. Then we create a component for rendering an individual feed item, called FeedItem. This is very simple, consisting of an anchor tag wrapped around a list item. Note the use of the const keyword - in ES6 this denotes a constant.
Next, we move onto the feed widget proper. We set the initial state of the feed to be an empty array. Then, we define a componentDidMount() method that calls getFeed() and sets up an interval to call it again, based on the delay property. The getFeed() method fetches the URL in question and sets this.state.feed to an array of the most recent entries in the feed, with the size denoted by the size property passed through. We also clear that interval when the component is about to be unmounted.
Note that you may run into problems with cross-origin requests, since most feeds won’t send a suitable Access-Control-Allow-Origin HTTP header. It’s possible to disable the same-origin restriction in your web browser, so if you want to run this as a dashboard you’ll probably need to do so - on Chrome there’s a useful plugin that allows you to disable it when needed.
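Alternatively, since the finished dashboard will eventually be served by Nginx anyway (see the deployment section below), you could sidestep the cross-origin issue by proxying the feed through your own server so the browser sees it as same-origin. Here’s a minimal sketch of what that might look like in your Nginx config - the /feeds/bbc path is just an example, and you’d then point the widget’s feed property at that path instead of the BBC URL:
| location /feeds/bbc { |
|     # Proxy the upstream RSS feed so requests from the dashboard are same-origin |
|     proxy_pass http://feeds.bbci.co.uk/news/rss.xml; |
| } |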
Because our FeedWidget has been created in a generic manner, we can then include multiple feed widgets easily, as in this example:
| import React from 'react'; |
| import ClockWidget from './clockwidget'; |
| import FeedWidget from './feedwidget'; |
| |
| export default React.createClass({ |
| render() { |
| return ( |
| <div className="dashboard"> |
| <h1 ref="title">{this.props.title}</h1> |
| <div className="wrapper"> |
| <ClockWidget /> |
| <FeedWidget feed="http://feeds.bbci.co.uk/news/rss.xml?edition=uk" size="5" delay="60" /> |
| <FeedWidget feed="https://www.sitepoint.com/feed/" size="10" delay="120" /> |
| </div> |
| </div> |
| ); |
| } |
| }); |
We also need to style our widgets. Save this as scss/_colours.scss:
| $bgColour: #151515; |
| $txtColour: #cfcfcf; |
| $clockBg: #fa8c00; |
| $clockHoverBg: #0099ff; |
| $clockTxt: #fff; |
| $feedBg: #0099ff; |
| $feedTxt: #fff; |
| $feedHoverBg: #fa8c00; |
And this as scss/style.scss:
| @import 'colours'; |
| |
| html, body { |
| background-color: $bgColour; |
| color: $txtColour; |
| font-family: Arial, Helvetica, sans-serif; |
| } |
| |
| div.dashboard { |
| padding: 10px; |
| } |
| |
| div.wrapper { |
| -moz-column-count: 4; |
| -webkit-column-count: 4; |
| column-count: 4; |
| -moz-column-gap: 1em; |
| -webkit-column-gap: 1em; |
| column-gap: 1em; |
| } |
| |
| div.widget { |
| display: inline-block; |
| margin: 0 0 1em; |
| width: 100%; |
| min-height: 100px; |
| margin: 5px; |
| opacity: 0.8; |
| transition: opacity 1s; |
| |
| &:hover { |
| opacity: 1; |
| } |
| |
| h2, h4 { |
| padding: 20px; |
| } |
| |
| div.widget-content { |
| width: 100%; |
| } |
| } |
| |
| div.clockwidget { |
| background-color: $clockBg; |
| color: $clockTxt; |
| } |
| |
| div.feedwidget { |
| background-color: $feedBg; |
| color: $feedTxt; |
| |
| h2 { |
| word-wrap: break-word; |
| } |
| |
| ul { |
| margin-left: 0; |
| padding-left: 20px; |
| |
| a { |
| text-decoration: none; |
| padding: 5px; |
| |
| li { |
| list-style-type: none; |
| font-weight: bold; |
| color: $feedTxt; |
| } |
| } |
| } |
| } |
The end result should look something like this:

With that done, feel free to add whatever other feeds you want to include.
Deploying our dashboard
The final step is deploying our dashboard to our Raspberry Pi or other device. Run the following command to generate the Javascript:
$ npm run build
This will create static/bundle.js. You can then copy that file over to your web server with index.html and place both files in the web root. I recommend using Nginx if you’re using a Raspberry Pi as it’s faster and simpler for static content. If you’re likely to make a lot of changes you might want to create a command in the scripts section of your package.json to deploy the files more easily.
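For example, you could add something like the hypothetical deploy script below to the scripts section - the hostname and web root here are assumptions, so adjust them to match your own Raspberry Pi:
| "deploy": "npm run build && scp index.html static/bundle.js pi@raspberrypi.local:/var/www/html/" |
You could then push a new build to the Pi with npm run deploy.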
These basic widgets should be enough to get you started. You should be able to use the feed widget with virtually any RSS feed, and a similar approach will work for polling third-party APIs, although you might need to authenticate in some way - if you do, you won’t want to expose your authentication details, so ensure that nobody from outside the network can view your application (there’s a rough sketch of an authenticated request after the list below). I’ll leave it to you to see what kind of interesting widgets you come up with for your own dashboard, but some ideas to get you started include:
- Public transport schedules/Traffic issues
- Weather reports
- Shopping lists/Todo lists, with HTML5 local storage used to persist them
- Galleries of recent photos on social networks
- Status of servers on cloud hosting providers
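As a rough illustration of the authenticated case mentioned above, a widget’s data-fetching method could pass a token through a header. This is only a sketch - the endpoint, prop names and token are all placeholders, and how you authenticate will depend entirely on the API in question:
| getData() { |
|     jQuery.ajax({ |
|         url: this.props.url, |
|         // The token is passed in as a prop purely for illustration - don't hard-code real credentials |
|         headers: { 'Authorization': 'Bearer ' + this.props.token }, |
|         success: (response) => { |
|             this.setState({ data: response }); |
|         } |
|     }); |
| }, |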
With a little thought, you can probably come up with a few more than that! I’ve created a Github repository with the source code so you can check your own implementation against it.
10th August 2016 8:45 pm
If, like me, you’re a web developer who sometimes also has to wear a sysadmin’s hat, then you’ll probably be coming across the same set of tasks each time you set up a new server. These may include:
- Provisioning new servers on cloud hosting providers such as Digital Ocean
- Setting up Cloudflare
- Installing a web server, database and other required packages
- Installing an existing web application, such as Wordpress
- Configuring the firewall and Fail2ban
- Keeping existing servers up to date
These can get tedious and repetitive fairly quickly - who genuinely wants to SSH into each server individually and run the updates regularly? Done manually, there’s also a danger of the setup for each server becoming inconsistent. Shell scripts can automate much of this, but they aren’t easy to read and aren’t necessarily easy to adapt to different operating systems. You need a way to manage multiple servers easily, maintain a series of reusable “recipes”, and do it all in a way that’s straightforward to read - in other words, a configuration management system.
There are others around, such as Chef, Puppet, and Salt, but my own choice is Ansible. Here’s why I went for Ansible:
- Playbooks and roles are defined as YAML, making them fairly straightforward to read and understand
- It’s written in Python, making it easy to create your own modules that leverage existing Python modules to get things done
- It’s distributed via pip, making it easy to install
- It doesn’t require you to install anything new on the servers, so you can get started straight away as soon as you can access a new server
- It has modules for interfacing with cloud services such as Digital Ocean and Amazon Web Services
Ansible is very easy to use, but you do still need to know what is actually going on to get the best out of it. It’s intended as a convenient abstraction on top of the underlying commands, not a replacement, and you should know how to do what you want to do manually before you write an Ansible playbook to do it.
Setting up
You need to have Python 2 available. Ansible doesn’t yet support Python 3 (Grr…), so if you’re using an operating system that has switched to Python 3, such as Arch Linux, you’ll need to have Python 2 installed as well. Assuming you have pip installed, run this command to install Ansible:
$ sudo pip install ansible
Or for users on systems with Python 3 as the main Python:
$ sudo pip2 install ansible
For Windows users, you’ll want to drop sudo. On Unix-like OS’s that don’t have sudo installed, drop it and run the command as root.
Our first Ansible command
We’ll demonstrate Ansible in action with a Vagrant VM. Drop the following Vagrantfile into your working directory:
| |
| |
| VAGRANTFILE_API_VERSION = "2" |
| |
| Vagrant.configure(VAGRANTFILE_API_VERSION) do |config| |
| config.vm.box = "debian/jessie64" |
| config.vm.network "forwarded_port", guest: 80, host: 8080 |
| end |
Then fire up the VM:
$ vagrant up
This VM will be our test bed for running Ansible. If you prefer, you can use a remote server instead.
Next, we’ll configure Ansible. Save this as ansible.cfg:
| [defaults] |
| hostfile = inventory |
| remote_user = vagrant |
| private_key_file = .vagrant/machines/default/virtualbox/private_key |
In this case the remote user is vagrant because we’re using Vagrant, but to manage remote machines you would need to change this to the name of the account that you use on the server. The value of private_key_file will also normally be something like /home/matthew/.ssh/id_rsa (the path to your private key), but here we’re using the Vagrant-specific key.
Note the hostfile entry - this points to the list of hosts you want to manage with Ansible. Let’s create this next. Save the following as inventory:
testserver ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222
Note that we explicitly need to set the port here because we’re using Vagrant. Normally it will default to port 22. A typical entry for a remote server might look like this:
example.com ansible_ssh_host=192.168.56.101
Note also that we can refer to hosts by the names we give them, which can be as meaningful (or not) as you want.
Let’s run our first command:
| $ ansible all -m ping |
| testserver | SUCCESS => { |
| "changed": false, |
| "ping": "pong" |
| } |
We called Ansible with the hosts set to all, therefore every host in the inventory was contacted. We used the -m flag to say we were calling a module, and then specified the ping module. Ansible therefore pinged each server in turn.
We can call ad-hoc commands using the -a flag, as in this example:
| $ ansible all -a "uptime" |
| testserver | SUCCESS | rc=0 >> |
| 17:26:57 up 19 min, 1 user, load average: 0.00, 0.04, 0.13 |
This command gets the uptime for the server. If you only want to run the command on a single server, you can specify it by name:
| $ ansible testserver -a "uptime" |
| testserver | SUCCESS | rc=0 >> |
| 17:28:21 up 20 min, 1 user, load average: 0.02, 0.04, 0.13 |
Here we specified the server as testserver. What if you want to specify more than one server, but not all of them? You can create groups of servers in inventory, as in this example:
| [webservers] |
| testserver ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 |
| example.com ansible_ssh_host=192.168.56.101 |
You could then call the following to run the uptime command on all the servers in the webservers group:
$ ansible webservers -a 'uptime'
If you want to run the command as a different user, you can do so:
$ ansible webservers -a 'uptime' -u bob
Note that for running uptime we haven’t specified the -m flag. This is because the command module is the default, but it’s very basic and doesn’t support shell variables. For more complex interactions you might need to use the shell module, as in this example:
| $ ansible testserver -m shell -a 'echo $PATH' |
| testserver | SUCCESS | rc=0 >> |
| /usr/local/bin:/usr/bin:/bin:/usr/games |
For installing a package on Debian or Ubuntu, you might use the apt module:
| $ ansible testserver -m apt -a "name=git state=present" --become |
| testserver | SUCCESS => { |
| "cache_update_time": 0, |
| "cache_updated": false, |
| "changed": true, |
| "stderr": "", |
| "stdout": "Reading package lists...\nBuilding dependency tree...\nReading state information...\nThe following extra packages will be installed:\n git-man liberror-perl\nSuggested packages:\n git-daemon-run git-daemon-sysvinit git-doc git-el git-email git-gui gitk\n gitweb git-arch git-cvs git-mediawiki git-svn\nThe following NEW packages will be installed:\n git git-man liberror-perl\n0 upgraded, 3 newly installed, 0 to remove and 83 not upgraded.\nNeed to get 4552 kB of archives.\nAfter this operation, 23.5 MB of additional disk space will be used.\nGet:1 http://httpredir.debian.org/debian/ jessie/main liberror-perl all 0.17-1.1 [22.4 kB]\nGet:2 http://httpredir.debian.org/debian/ jessie/main git-man all 1:2.1.4-2.1+deb8u2 [1267 kB]\nGet:3 http://httpredir.debian.org/debian/ jessie/main git amd64 1:2.1.4-2.1+deb8u2 [3262 kB]\nFetched 4552 kB in 1s (3004 kB/s)\nSelecting previously unselected package liberror-perl.\r\n(Reading database ... \r(Reading database ... 5%\r(Reading database ... 10%\r(Reading database ... 15%\r(Reading database ... 20%\r(Reading database ... 25%\r(Reading database ... 30%\r(Reading database ... 35%\r(Reading database ... 40%\r(Reading database ... 45%\r(Reading database ... 50%\r(Reading database ... 55%\r(Reading database ... 60%\r(Reading database ... 65%\r(Reading database ... 70%\r(Reading database ... 75%\r(Reading database ... 80%\r(Reading database ... 85%\r(Reading database ... 90%\r(Reading database ... 95%\r(Reading database ... 100%\r(Reading database ... 32784 files and directories currently installed.)\r\nPreparing to unpack .../liberror-perl_0.17-1.1_all.deb ...\r\nUnpacking liberror-perl (0.17-1.1) ...\r\nSelecting previously unselected package git-man.\r\nPreparing to unpack .../git-man_1%3a2.1.4-2.1+deb8u2_all.deb ...\r\nUnpacking git-man (1:2.1.4-2.1+deb8u2) ...\r\nSelecting previously unselected package git.\r\nPreparing to unpack .../git_1%3a2.1.4-2.1+deb8u2_amd64.deb ...\r\nUnpacking git (1:2.1.4-2.1+deb8u2) ...\r\nProcessing triggers for man-db (2.7.0.2-5) ...\r\nSetting up liberror-perl (0.17-1.1) ...\r\nSetting up git-man (1:2.1.4-2.1+deb8u2) ...\r\nSetting up git (1:2.1.4-2.1+deb8u2) ...\r\n", |
| "stdout_lines": [ |
| "Reading package lists...", |
| "Building dependency tree...", |
| "Reading state information...", |
| "The following extra packages will be installed:", |
| " git-man liberror-perl", |
| "Suggested packages:", |
| " git-daemon-run git-daemon-sysvinit git-doc git-el git-email git-gui gitk", |
| " gitweb git-arch git-cvs git-mediawiki git-svn", |
| "The following NEW packages will be installed:", |
| " git git-man liberror-perl", |
| "0 upgraded, 3 newly installed, 0 to remove and 83 not upgraded.", |
| "Need to get 4552 kB of archives.", |
| "After this operation, 23.5 MB of additional disk space will be used.", |
| "Get:1 http://httpredir.debian.org/debian/ jessie/main liberror-perl all 0.17-1.1 [22.4 kB]", |
| "Get:2 http://httpredir.debian.org/debian/ jessie/main git-man all 1:2.1.4-2.1+deb8u2 [1267 kB]", |
| "Get:3 http://httpredir.debian.org/debian/ jessie/main git amd64 1:2.1.4-2.1+deb8u2 [3262 kB]", |
| "Fetched 4552 kB in 1s (3004 kB/s)", |
| "Selecting previously unselected package liberror-perl.", |
| "(Reading database ... ", |
| "(Reading database ... 5%", |
| "(Reading database ... 10%", |
| "(Reading database ... 15%", |
| "(Reading database ... 20%", |
| "(Reading database ... 25%", |
| "(Reading database ... 30%", |
| "(Reading database ... 35%", |
| "(Reading database ... 40%", |
| "(Reading database ... 45%", |
| "(Reading database ... 50%", |
| "(Reading database ... 55%", |
| "(Reading database ... 60%", |
| "(Reading database ... 65%", |
| "(Reading database ... 70%", |
| "(Reading database ... 75%", |
| "(Reading database ... 80%", |
| "(Reading database ... 85%", |
| "(Reading database ... 90%", |
| "(Reading database ... 95%", |
| "(Reading database ... 100%", |
| "(Reading database ... 32784 files and directories currently installed.)", |
| "Preparing to unpack .../liberror-perl_0.17-1.1_all.deb ...", |
| "Unpacking liberror-perl (0.17-1.1) ...", |
| "Selecting previously unselected package git-man.", |
| "Preparing to unpack .../git-man_1%3a2.1.4-2.1+deb8u2_all.deb ...", |
| "Unpacking git-man (1:2.1.4-2.1+deb8u2) ...", |
| "Selecting previously unselected package git.", |
| "Preparing to unpack .../git_1%3a2.1.4-2.1+deb8u2_amd64.deb ...", |
| "Unpacking git (1:2.1.4-2.1+deb8u2) ...", |
| "Processing triggers for man-db (2.7.0.2-5) ...", |
| "Setting up liberror-perl (0.17-1.1) ...", |
| "Setting up git-man (1:2.1.4-2.1+deb8u2) ...", |
| "Setting up git (1:2.1.4-2.1+deb8u2) ..." |
| ] |
| } |
Here we specify that a particular package should be state=present or state=absent. Also, note the --become flag, which allows us to become root. If you’re using an RPM-based Linux distro, you can use the yum module in the same way.
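For instance, the equivalent ad-hoc command on a CentOS or Fedora host might look like this:
$ ansible testserver -m yum -a "name=git state=present" --become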
Finally, let’s use the git module to check out a project on the server:
| $ ansible testserver -m git -a "repo=https://github.com/matthewbdaly/django_tutorial_blog_ng.git dest=/home/vagrant/example version=HEAD" |
| testserver | SUCCESS => { |
| "after": "3542098e3b01103db4d9cfc724ba3c71c45cb314", |
| "before": null, |
| "changed": true, |
| "warnings": [] |
| } |
Here we check out a Git repository. We specify the repo, destination and version.
You can call any installed Ansible module in an ad-hoc fashion in the same way. Refer to the documentation for a list of modules.
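You can also browse the modules installed on your machine from the command line with the ansible-doc tool that ships with Ansible - the first command below lists the available modules, and the second shows the options for the git module:
| $ ansible-doc -l |
| $ ansible-doc git |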
Playbooks
Ad-hoc commands are useful, but they don’t offer much extra over using SSH. Playbooks allow you to define a repeatable set of commands for a particular use case. In this example, I’ll show you how to write a playbook that does the following:
- Installs and configures Nginx
- Clones the repository for my site into the web root
This is sufficiently complex to demonstrate some more of the functionality of Ansible, while also demonstrating playbooks in action.
Create a new folder called playbooks, and inside it save the following as sitecopy.yml:
| --- |
| - name: Copy personal website |
| hosts: testserver |
| become: True |
| tasks: |
| - name: Install Nginx |
| apt: name=nginx update_cache=yes |
| - name: Copy config |
| copy: > |
| src=files/nginx.conf |
| dest=/etc/nginx/sites-available/default |
| - name: Activate config |
| file: > |
| dest=/etc/nginx/sites-enabled/default |
| src=/etc/nginx/sites-available/default |
| state=link |
| - name: Delete /var/www directory |
| file: > |
| path=/var/www |
| state=absent |
| - name: Clone repository |
| git: > |
| repo=https://github.com/matthewbdaly/matthewbdaly.github.io.git |
| dest=/var/www |
| version=HEAD |
| - name: Restart Nginx |
| service: name=nginx state=restarted |
Note the name fields - these are human-readable descriptions that will show up in the output when each step is run. First we use the apt module to install Nginx, then we copy over the config file and activate it, then we empty the existing /var/www and clone the repository, and finally we restart Nginx.
Also, note the following fields:
- hosts defines the hosts affected
- become specifies that the commands are run using sudo
We also need to create the config for Nginx. Create the files directory under playbooks and save this file as playbooks/files/nginx.conf:
| server { |
| listen 80 default_server; |
| listen [::]:80 default_server ipv6only=on; |
| |
| root /var/www; |
| index index.html index.htm; |
| |
| server_name localhost; |
| |
| location / { |
| try_files $uri $uri/ =404; |
| } |
| } |
Obviously, if your Nginx config needs to be different, feel free to amend it as necessary. Finally, we run the playbook using the ansible-playbook command:
| $ ansible-playbook playbooks/sitecopy.yml |
| |
| PLAY [Copy personal website] *************************************************** |
| |
| TASK [setup] ******************************************************************* |
| ok: [testserver] |
| |
| TASK [Install Nginx] *********************************************************** |
| changed: [testserver] |
| |
| TASK [Copy config] ************************************************************* |
| changed: [testserver] |
| |
| TASK [Activate config] ********************************************************* |
| changed: [testserver] |
| |
| TASK [Delete /var/www directory] *********************************************** |
| changed: [testserver] |
| |
| TASK [Clone repository] ******************************************************** |
| changed: [testserver] |
| |
| TASK [Restart Nginx] *********************************************************** |
| changed: [testserver] |
| |
| PLAY RECAP ********************************************************************* |
| testserver : ok=7 changed=6 unreachable=0 failed=0 |
| |
If we had a playbook that we wanted to run on only a subset of the hosts it applied to, we could use the -l flag, as in this example:
$ ansible-playbook playbooks/sitecopy.yml -l testserver
Using these same basic concepts, you can invoke many different Ansible modules to achieve many different tasks. You can spin up new servers on supported cloud hosting companies, you can set up a known good fail2ban config, you can configure your firewall, and many more tasks. As your playbooks get bigger, it’s worth moving sections into separate roles that get invoked within multiple playbooks, in order to reduce repetition.
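Once tasks have been extracted into roles, a playbook can become little more than a list of roles to apply to a group of hosts. Here’s a rough sketch - the role names are hypothetical, and each one would live under a roles/ directory alongside your playbooks:
| --- |
| - name: Provision web servers |
|   hosts: webservers |
|   become: True |
|   roles: |
|     - common |
|     - nginx |
|     - fail2ban |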
Finally, I mentioned earlier that you can use Ansible to update all of your servers regularly. Here’s the playbook I use for that:
| --- |
| - name: Update system |
| hosts: all |
| become: True |
| tasks: |
| - name: update system |
| apt: upgrade=full update_cache=yes |
This connects to all hosts using the all shortcut we saw earlier, and upgrades all existing packages. Using this method is a lot easier than connecting to each one in turn via SSH and updating it manually.
Summary
Ansible is an extremely useful tool for managing servers, but to get the most out of it you have to put in a fair bit of work reading the documentation and writing your own playbooks for your own use cases. It’s simple to get started with, and if you’re willing to put in the time writing your own playbooks then in the long run you’ll save yourself a lot of time and grief by making it easy to set up new servers and administer existing ones. Hopefully this has given you a taster of what you can do with Ansible - from here on the documentation is worth a look as it lists all of the modules that ship with Ansible. If there’s a particular task you dread, such as setting up a mail server, then Ansible is a very good way to automate that away so it’s easier next time.
My experience is that it’s best to make an effort to try to standardise on two or three different stacks for different purposes, and create Ansible playbooks for those stacks. For instance, I’ve tended to use PHP 5, Apache, MySQL, Memcached and Varnish for Wordpress sites, and PHP 7, Nginx, Redis and PostgreSQL for Laravel sites. That way I know that any sites I build with Laravel will be using that stack. Knowing my servers are more consistent makes it easier to work with them and identify problems.
8th August 2016 5:05 pm
Documenting your API is something most developers agree is generally a Good Thing, but it’s a pain in the backside, and somewhat boring to do. What you really need is a tool that allows you to specify the details of your API before you start work, generate documentation from that specification, and test your implementation against that specification.
Fortunately, such a tool exists. The Blueprint specification allows you to document your API using a Markdown-like syntax. You can then create HTML documentation using a tool like Aglio or Apiary, and test it against your implementation using Dredd.
In this tutorial we’ll implement a very basic REST API using the Lumen framework. We’ll first specify our API, then we’ll implement routes to match the implementation. In the process, we’ll demonstrate the Blueprint specification in action.
Getting started
Assuming you already have PHP 5.6 or better and Composer installed, run the following command to create our Lumen app skeleton:
$ composer create-project --prefer-dist laravel/lumen demoapi
Once it has finished installing, we’ll also need to add the Dredd hooks:
| $ cd demoapi |
| $ composer require ddelnano/dredd-hooks-php |
Next, we need to install Dredd itself. It’s a Node.js tool, so you’ll need to have Node.js installed. We’ll also install Aglio to generate HTML versions of our documentation:
$ npm install -g aglio dredd
We also need to create a configuration file for Dredd, which you can do by running dredd init. Or you can just copy the one below:
| dry-run: null |
| hookfiles: null |
| language: php |
| sandbox: false |
| server: 'php -S localhost:3000 -t public/' |
| server-wait: 3 |
| init: false |
| custom: |
| apiaryApiKey: '' |
| names: false |
| only: [] |
| reporter: apiary |
| output: [] |
| header: [] |
| sorted: false |
| user: null |
| inline-errors: false |
| details: false |
| method: [] |
| color: true |
| level: info |
| timestamp: false |
| silent: false |
| path: [] |
| hooks-worker-timeout: 5000 |
| hooks-worker-connect-timeout: 1500 |
| hooks-worker-connect-retry: 500 |
| hooks-worker-after-connect-wait: 100 |
| hooks-worker-term-timeout: 5000 |
| hooks-worker-term-retry: 500 |
| hooks-worker-handler-host: localhost |
| hooks-worker-handler-port: 61321 |
| config: ./dredd.yml |
| blueprint: apiary.apib |
| endpoint: 'http://localhost:3000' |
If you choose to run dredd init, you’ll see prompts for a number of things, including:
- The server command
- The blueprint file name
- The endpoint
- Any Apiary API key
- The language you want to use
There are Dredd hooks for many languages, so if you’re planning on building a REST API in a language other than PHP, don’t worry - you can still test it with Dredd, you’ll just get prompted to install different hooks.
Note the hookfiles section, which specifies a hookfile to run during the test in order to set up the API. We’ll touch on that in a moment. Also, note the server setting - this specifies the command we should call to run the server. In this case we’re using the PHP development server.
If you’re using Apiary with your API (which I highly recommend), you can also set the following parameter to ensure that every time you run Dredd, it submits the results to Apiary:
| custom: |
| apiaryApiKey: <API KEY HERE> |
| apiaryApiName: <API NAME HERE> |
Hookfiles
As mentioned, the hooks allow you to set up your API. In our case, we’ll need to set up some fixtures for our tests. Save this file at tests/dredd/hooks/hookfile.php:
| <?php |
| |
| use Dredd\Hooks; |
| use Illuminate\Support\Facades\Artisan; |
| |
| require __DIR__ . '/../../../vendor/autoload.php'; |
| |
| $app = require __DIR__ . '/../../../bootstrap/app.php'; |
| |
| $app->make(\Illuminate\Contracts\Console\Kernel::class)->bootstrap(); |
| |
| Hooks::beforeAll(function (&$transaction) use ($app) { |
| putenv('DB_CONNECTION=sqlite'); |
| putenv('DB_DATABASE=:memory:'); |
| Artisan::call('migrate:refresh'); |
| Artisan::call('db:seed'); |
| }); |
| Hooks::beforeEach(function (&$transaction) use ($app) { |
| Artisan::call('migrate:refresh'); |
| Artisan::call('db:seed'); |
| }); |
Before the tests run, we set the environment up to use an in-memory SQLite database. We also migrate and seed the database, so we’re working with a clean database. As part of this tutorial, we’ll create seed files for the fixtures we need in the database.
This hookfile assumes that the user does not need to be authenticated to communicate with the API. If that’s not the case for your API, you may want to include something like this in your hookfile’s beforeEach callback:
| $user = App\User::first(); |
| $token = JWTAuth::fromUser($user); |
| $transaction->request->headers->Authorization = 'Bearer ' . $token; |
Here we’re using the JWT Auth package for Laravel to authenticate users of our API, and we need to set the Authorization
header to contain a valid JSON web token for the given user. If you’re using a different method, such as HTTP Basic authentication, you’ll need to amend this code to reflect that.
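For example, with HTTP Basic authentication, the equivalent beforeEach hook might look something like this - a sketch only, with placeholder credentials that would need to match a user seeded into the database:
| Hooks::beforeEach(function (&$transaction) use ($app) { |
|     Artisan::call('migrate:refresh'); |
|     Artisan::call('db:seed'); |
|     // Hypothetical example: send HTTP Basic credentials with every request |
|     $transaction->request->headers->Authorization = 'Basic ' . base64_encode('user@example.com:secret'); |
| }); |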
With that done, we need to create the Blueprint file for our API. Recall the following line in dredd.yml:
blueprint: apiary.apib
This specifies the path to our documentation. Let’s create that file:
$ touch apiary.apib
Once this is done, you should be able to run Dredd:
| $ dredd |
| info: Configuration './dredd.yml' found, ignoring other arguments. |
| info: Using apiary reporter. |
| info: Starting server with command: php -S localhost:3000 -t public/ |
| info: Waiting 3 seconds for server command to start... |
| warn: Parser warning in file 'apiary.apib': (warning code undefined) Could not recognize API description format. Falling back to API Blueprint by default. |
| info: Beginning Dredd testing... |
| complete: Tests took 619ms |
| complete: See results in Apiary at: https://app.apiary.io/public/tests/run/4aab4155-cfc4-4fda-983a-fea280933ad4 |
| info: Sending SIGTERM to the backend server |
| info: Backend server was killed |
With that done, we’re ready to start work on our API.
Our first route
Dredd is not a testing tool in the usual sense. Under no circumstances should you use it as a substitute for something like PHPUnit - that’s not what it’s for. It’s for ensuring that your documentation and your implementation remain in sync. However, it’s not entirely impractical to use it as a Behaviour-driven development tool in the same vein as Cucumber or Behat - you can use it to plan out the endpoints your API will have, the requests they accept, and the responses they return, and then verify your implementation against the documentation.
To keep this tutorial as simple and concise as possible, we'll only cover a single resource. It will expose products for a shop, and will allow users to fetch, create, edit and delete products. Note that we won't be implementing any kind of authentication, which in production is almost certainly not what you want - we're just going for the simplest possible implementation.
First, we’ll implement getting a list of products:
| FORMAT: 1A |
| |
| # Demo API |
| |
| # Products [/api/products] |
| Product object representation |
| |
| ## Get products [GET /api/products] |
| Get a list of products |
| |
| + Request (application/json) |
| |
| + Response 200 (application/json) |
| + Body |
| |
| { |
| "id": 1, |
| "name": "Purple widget", |
| "description": "A purple widget", |
| "price": 5.99, |
| "attributes": { |
| "colour": "Purple", |
| "size": "Small" |
| } |
| } |
A little explanation is called for. First, the FORMAT section denotes the version of the API Blueprint format in use. Then the # Demo API heading denotes the name of the API. Next, we define the Products endpoint, followed by our first method. Then we define what should be contained in the request, and what the response should look like. Blueprint is a little more complex than that, but that's sufficient to get us started.
Then we run dredd again:
| $ dredd |
| info: Configuration './dredd.yml' found, ignoring other arguments. |
| info: Using apiary reporter. |
| info: Starting server with command: php -S localhost:3000 -t public/ |
| info: Waiting 3 seconds for server command to start... |
| info: Beginning Dredd testing... |
| fail: GET /api/products duration: 61ms |
| info: Displaying failed tests... |
| fail: GET /api/products duration: 61ms |
| fail: headers: Header 'content-type' has value 'text/html; charset=UTF-8' instead of 'application/json' |
| body: Can't validate real media type 'text/plain' against expected media type 'application/json'. |
| statusCode: Status code is not '200' |
| |
| request: |
| method: GET |
| uri: /api/products |
| headers: |
| Content-Type: application/json |
| User-Agent: Dredd/1.5.0 (Linux 4.4.0-31-generic; x64) |
| |
| body: |
| |
| |
| |
| expected: |
| headers: |
| Content-Type: application/json |
| |
| body: |
| { |
| "id": 1, |
| "name": "Purple widget", |
| "description": "A purple widget", |
| "price": 5.99, |
| "attributes": { |
| "colour": "Purple", |
| "size": "Small" |
| } |
| } |
| statusCode: 200 |
| |
| |
| actual: |
| statusCode: 404 |
| headers: |
| host: localhost:3000 |
| connection: close |
| x-powered-by: PHP/7.0.8-0ubuntu0.16.04.2 |
| cache-control: no-cache |
| date: Mon, 08 Aug 2016 10:30:33 GMT |
| content-type: text/html; charset=UTF-8 |
| |
| body: |
| <!DOCTYPE html> |
| <html> |
| <head> |
| <meta name="robots" content="noindex,nofollow" /> |
| <style> |
| /* Copyright (c) 2010, Yahoo! Inc. All rights reserved. Code licensed under the BSD License: http://developer.yahoo.com/yui/license.html */ |
| html{color:#000;background:#FFF;}body,div,dl,dt,dd,ul,ol,li,h1,h2,h3,h4,h5,h6,pre,code,form,fieldset,legend,input,textarea,p,blockquote,th,td{margin:0;padding:0;}table{border-collapse:collapse;border-spacing:0;}fieldset,img{border:0;}address,caption,cite,code,dfn,em,strong,th,var{font-style:normal;font-weight:normal;}li{list-style:none;}caption,th{text-align:left;}h1,h2,h3,h4,h5,h6{font-size:100%;font-weight:normal;}q:before,q:after{content:'';}abbr,acronym{border:0;font-variant:normal;}sup{vertical-align:text-top;}sub{vertical-align:text-bottom;}input,textarea,select{font-family:inherit;font-size:inherit;font-weight:inherit;}input,textarea,select{*font-size:100%;}legend{color:#000;} |
| html { background: #eee; padding: 10px } |
| img { border: 0; } |
| #sf-resetcontent { width:970px; margin:0 auto; } |
| .sf-reset { font: 11px Verdana, Arial, sans-serif; color: #333 } |
| .sf-reset .clear { clear:both; height:0; font-size:0; line-height:0; } |
| .sf-reset .clear_fix:after { display:block; height:0; clear:both; visibility:hidden; } |
| .sf-reset .clear_fix { display:inline-block; } |
| .sf-reset * html .clear_fix { height:1%; } |
| .sf-reset .clear_fix { display:block; } |
| .sf-reset, .sf-reset .block { margin: auto } |
| .sf-reset abbr { border-bottom: 1px dotted #000; cursor: help; } |
| .sf-reset p { font-size:14px; line-height:20px; color:#868686; padding-bottom:20px } |
| .sf-reset strong { font-weight:bold; } |
| .sf-reset a { color:#6c6159; cursor: default; } |
| .sf-reset a img { border:none; } |
| .sf-reset a:hover { text-decoration:underline; } |
| .sf-reset em { font-style:italic; } |
| .sf-reset h1, .sf-reset h2 { font: 20px Georgia, "Times New Roman", Times, serif } |
| .sf-reset .exception_counter { background-color: #fff; color: #333; padding: 6px; float: left; margin-right: 10px; float: left; display: block; } |
| .sf-reset .exception_title { margin-left: 3em; margin-bottom: 0.7em; display: block; } |
| .sf-reset .exception_message { margin-left: 3em; display: block; } |
| .sf-reset .traces li { font-size:12px; padding: 2px 4px; list-style-type:decimal; margin-left:20px; } |
| .sf-reset .block { background-color:#FFFFFF; padding:10px 28px; margin-bottom:20px; |
| -webkit-border-bottom-right-radius: 16px; |
| -webkit-border-bottom-left-radius: 16px; |
| -moz-border-radius-bottomright: 16px; |
| -moz-border-radius-bottomleft: 16px; |
| border-bottom-right-radius: 16px; |
| border-bottom-left-radius: 16px; |
| border-bottom:1px solid #ccc; |
| border-right:1px solid #ccc; |
| border-left:1px solid #ccc; |
| } |
| .sf-reset .block_exception { background-color:#ddd; color: #333; padding:20px; |
| -webkit-border-top-left-radius: 16px; |
| -webkit-border-top-right-radius: 16px; |
| -moz-border-radius-topleft: 16px; |
| -moz-border-radius-topright: 16px; |
| border-top-left-radius: 16px; |
| border-top-right-radius: 16px; |
| border-top:1px solid #ccc; |
| border-right:1px solid #ccc; |
| border-left:1px solid #ccc; |
| overflow: hidden; |
| word-wrap: break-word; |
| } |
| .sf-reset a { background:none; color:#868686; text-decoration:none; } |
| .sf-reset a:hover { background:none; color:#313131; text-decoration:underline; } |
| .sf-reset ol { padding: 10px 0; } |
| .sf-reset h1 { background-color:#FFFFFF; padding: 15px 28px; margin-bottom: 20px; |
| -webkit-border-radius: 10px; |
| -moz-border-radius: 10px; |
| border-radius: 10px; |
| border: 1px solid #ccc; |
| } |
| </style> |
| </head> |
| <body> |
| <div id="sf-resetcontent" class="sf-reset"> |
| <h1>Sorry, the page you are looking for could not be found.</h1> |
| <h2 class="block_exception clear_fix"> |
| <span class="exception_counter">1/1</span> |
| <span class="exception_title"><abbr title="Symfony\Component\HttpKernel\Exception\NotFoundHttpException">NotFoundHttpException</abbr> in <a title="/home/matthew/Projects/demoapi/vendor/laravel/lumen-framework/src/Concerns/RoutesRequests.php line 450" ondblclick="var f=this.innerHTML;this.innerHTML=this.title;this.title=f;">RoutesRequests.php line 450</a>:</span> |
| <span class="exception_message"></span> |
| </h2> |
| <div class="block"> |
| <ol class="traces list_exception"> |
| <li> in <a title="/home/matthew/Projects/demoapi/vendor/laravel/lumen-framework/src/Concerns/RoutesRequests.php line 450" ondblclick="var f=this.innerHTML;this.innerHTML=this.title;this.title=f;">RoutesRequests.php line 450</a></li> |
| <li>at <abbr title="Laravel\Lumen\Application">Application</abbr>->handleDispatcherResponse(<em>array</em>('0')) in <a title="/home/matthew/Projects/demoapi/vendor/laravel/lumen-framework/src/Concerns/RoutesRequests.php line 387" ondblclick="var f=this.innerHTML;this.innerHTML=this.title;this.title=f;">RoutesRequests.php line 387</a></li> |
| <li>at <abbr title="Laravel\Lumen\Application">Application</abbr>->Laravel\Lumen\Concerns\{closure}() in <a title="/home/matthew/Projects/demoapi/vendor/laravel/lumen-framework/src/Concerns/RoutesRequests.php line 636" ondblclick="var f=this.innerHTML;this.innerHTML=this.title;this.title=f;">RoutesRequests.php line 636</a></li> |
| <li>at <abbr title="Laravel\Lumen\Application">Application</abbr>->sendThroughPipeline(<em>array</em>(), <em>object</em>(<abbr title="Closure">Closure</abbr>)) in <a title="/home/matthew/Projects/demoapi/vendor/laravel/lumen-framework/src/Concerns/RoutesRequests.php line 389" ondblclick="var f=this.innerHTML;this.innerHTML=this.title;this.title=f;">RoutesRequests.php line 389</a></li> |
| <li>at <abbr title="Laravel\Lumen\Application">Application</abbr>->dispatch(<em>null</em>) in <a title="/home/matthew/Projects/demoapi/vendor/laravel/lumen-framework/src/Concerns/RoutesRequests.php line 334" ondblclick="var f=this.innerHTML;this.innerHTML=this.title;this.title=f;">RoutesRequests.php line 334</a></li> |
| <li>at <abbr title="Laravel\Lumen\Application">Application</abbr>->run() in <a title="/home/matthew/Projects/demoapi/public/index.php line 28" ondblclick="var f=this.innerHTML;this.innerHTML=this.title;this.title=f;">index.php line 28</a></li> |
| </ol> |
| </div> |
| |
| </div> |
| </body> |
| </html> |
| |
| |
| |
| complete: 0 passing, 1 failing, 0 errors, 0 skipped, 1 total |
| complete: Tests took 533ms |
| [Mon Aug 8 11:30:33 2016] 127.0.0.1:44472 [404]: /api/products |
| complete: See results in Apiary at: https://app.apiary.io/public/tests/run/0153d5bf-6efa-4fdb-b02a-246ddd75cb14 |
| info: Sending SIGTERM to the backend server |
| info: Backend server was killed |
Our route is returning HTML, not JSON, and is also raising a 404 error. So let's fix that. First, let's create our Product model at app/Product.php:
| <?php |
| |
| namespace App; |
| |
| use Illuminate\Database\Eloquent\Model; |
| |
| class Product extends Model |
| { |
| |
| } |
Next, we need to create a migration for the database table for the Product model:
| $ php artisan make:migration create_product_table |
| Created Migration: 2016_08_08_105737_create_product_table |
This will create a new file under database/migrations. Open this file and paste in the following:
| <?php |
| |
| use Illuminate\Database\Schema\Blueprint; |
| use Illuminate\Database\Migrations\Migration; |
| |
| class CreateProductTable extends Migration |
| { |
| /** |
| * Run the migrations. |
| * |
| * @return void |
| */ |
| public function up() |
| { |
| |
| Schema::create('products', function (Blueprint $table) { |
| $table->increments('id'); |
| $table->string('name'); |
| $table->text('description'); |
| $table->float('price'); |
| $table->json('attributes'); |
| $table->timestamps(); |
| }); |
| } |
| |
| /** |
| * Reverse the migrations. |
| * |
| * @return void |
| */ |
| public function down() |
| { |
| |
| Schema::drop('products'); |
| } |
| } |
Note that we create fields that map to the attributes our API exposes. Also, note the use of the JSON field. In databases that support it, like PostgreSQL, it uses the native JSON support, otherwise it works like a text field. Next, we run the migration to create the table:
| $ php artisan migrate |
| Migrated: 2016_08_08_105737_create_product_table |
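As an aside, since the attributes column stores JSON and price is a float, you could have Eloquent cast them automatically by adding a $casts property to the Product model. We won't do that in this tutorial, as it would change the shape of the JSON the API returns (our blueprint documents attributes as a raw JSON string), but a hypothetical version of the model would look something like this:
| <?php |
| |
| namespace App; |
| |
| use Illuminate\Database\Eloquent\Model; |
| |
| class Product extends Model |
| { |
|     // Hypothetical casts - not used in this tutorial, since the blueprint |
|     // documents the attributes field as a raw JSON string |
|     protected $casts = [ |
|         'price' => 'float', |
|         'attributes' => 'array', |
|     ]; |
| } |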
With our model done, we now need to ensure that when Dredd runs, there is some data in the database, so we'll create a seeder file at database/seeds/ProductSeeder.php:
| <?php |
| |
| use Illuminate\Database\Seeder; |
| use Carbon\Carbon; |
| |
| class ProductSeeder extends Seeder |
| { |
| /** |
| * Run the database seeds. |
| * |
| * @return void |
| */ |
| public function run() |
| { |
| |
| DB::table('products')->insert([ |
| 'name' => 'Purple widget', |
| 'description' => 'A purple widget', |
| 'price' => 5.99, |
| 'attributes' => json_encode([ |
| 'colour' => 'purple', |
| 'size' => 'Small' |
| ]), |
| 'created_at' => Carbon::now(), |
| 'updated_at' => Carbon::now(), |
| ]); |
| } |
| } |
You also need to amend database/seeds/DatabaseSeeder.php to call it:
| <?php |
| |
| use Illuminate\Database\Seeder; |
| |
| class DatabaseSeeder extends Seeder |
| { |
| /** |
| * Run the database seeds. |
| * |
| * @return void |
| */ |
| public function run() |
| { |
| $this->call('ProductSeeder'); |
| } |
| } |
I found I also had to run the following command so that Composer could find the new seeder class:
$ composer dump-autoload
Then, call the seeder:
| $ php artisan db:seed |
| Seeded: ProductSeeder |
We also need to enable Eloquent, as Lumen disables it by default. Uncomment the following line in bootstrap/app.php:
$app->withEloquent();
With that done, we can move onto the controller.
Creating the controller
Create the following file at app/Http/Controllers/ProductController.php:
| <?php |
| |
| namespace App\Http\Controllers; |
| |
| use Illuminate\Http\Request; |
| |
| use App\Product; |
| |
| class ProductController extends Controller |
| { |
| private $product; |
| |
| public function __construct(Product $product) { |
| $this->product = $product; |
| } |
| |
| public function index() |
| { |
| |
| $products = $this->product->all(); |
| |
| |
| return response()->json($products, 200); |
| } |
| } |
This implements the index route. Note that we inject the Product instance into the controller. Next, we need to hook it up in app/Http/routes.php:
| <?php |
| |
| /* |
| |-------------------------------------------------------------------------- |
| | Application Routes |
| |-------------------------------------------------------------------------- |
| | |
| | Here is where you can register all of the routes for an application. |
| | It is a breeze. Simply tell Lumen the URIs it should respond to |
| | and give it the Closure to call when that URI is requested. |
| | |
| */ |
| |
| $app->get('/api/products', 'ProductController@index'); |
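Incidentally, if you did want to put these routes behind authentication later on, Lumen lets you register route middleware and wrap the routes in a group. A purely hypothetical sketch - this tutorial deliberately leaves the API open:
| // In bootstrap/app.php (hypothetical - not used in this tutorial) |
| $app->routeMiddleware([ |
|     'auth' => App\Http\Middleware\Authenticate::class, |
| ]); |
| |
| // In app/Http/routes.php, wrap the routes in a middleware group |
| $app->group(['middleware' => 'auth'], function () use ($app) { |
|     $app->get('/api/products', 'ProductController@index'); |
| }); |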
Then we run Dredd again:
| $ dredd |
| info: Configuration './dredd.yml' found, ignoring other arguments. |
| info: Using apiary reporter. |
| info: Starting server with command: php -S localhost:3000 -t public/ |
| info: Waiting 3 seconds for server command to start... |
| info: Beginning Dredd testing... |
| [Mon Aug 8 12:36:28 2016] 127.0.0.1:45466 [200]: /api/products |
| fail: GET /api/products duration: 131ms |
| info: Displaying failed tests... |
| fail: GET /api/products duration: 131ms |
| fail: body: At '' Invalid type: array (expected object) |
| |
| request: |
| method: GET |
| uri: /api/products |
| headers: |
| Content-Type: application/json |
| User-Agent: Dredd/1.5.0 (Linux 4.4.0-31-generic; x64) |
| |
| body: |
| |
| |
| |
| expected: |
| headers: |
| Content-Type: application/json |
| |
| body: |
| { |
| "id": 1, |
| "name": "Purple widget", |
| "description": "A purple widget", |
| "price": 5.99, |
| "attributes": { |
| "colour": "Purple", |
| "size": "Small" |
| } |
| } |
| statusCode: 200 |
| |
| |
| actual: |
| statusCode: 200 |
| headers: |
| host: localhost:3000 |
| connection: close |
| x-powered-by: PHP/7.0.8-0ubuntu0.16.04.2 |
| cache-control: no-cache |
| content-type: application/json |
| date: Mon, 08 Aug 2016 11:36:28 GMT |
| |
| body: |
| [ |
| { |
| "id": 1, |
| "name": "Purple widget", |
| "description": "A purple widget", |
| "price": "5.99", |
| "attributes": "{\"colour\":\"purple\",\"size\":\"Small\"}", |
| "created_at": "2016-08-08 11:32:24", |
| "updated_at": "2016-08-08 11:32:24" |
| } |
| ] |
| |
| |
| |
| complete: 0 passing, 1 failing, 0 errors, 0 skipped, 1 total |
| complete: Tests took 582ms |
| complete: See results in Apiary at: https://app.apiary.io/public/tests/run/83da2d67-c846-4356-a3b8-4d7c32daa7ef |
| info: Sending SIGTERM to the backend server |
| info: Backend server was killed |
Whoops, looks like we made a mistake here. The index route returns an array of objects, but we're looking for a single object in the blueprint. We also need to wrap our attributes in quotes, and add the created_at and updated_at attributes. Let's fix the blueprint:
| FORMAT: 1A |
| |
| # Demo API |
| |
| # Products [/api/products] |
| Product object representation |
| |
| ## Get products [GET /api/products] |
| Get a list of products |
| |
| + Request (application/json) |
| |
| + Response 200 (application/json) |
| + Body |
| |
| [ |
| { |
| "id": 1, |
| "name": "Purple widget", |
| "description": "A purple widget", |
| "price": 5.99, |
| "attributes": "{\"colour\": \"Purple\",\"size\": \"Small\"}", |
| "created_at": "*", |
| "updated_at": "*" |
| } |
| ] |
Let’s run Dredd again:
| $ dredd |
| info: Configuration './dredd.yml' found, ignoring other arguments. |
| info: Using apiary reporter. |
| info: Starting server with command: php -S localhost:3000 -t public/ |
| info: Waiting 3 seconds for server command to start... |
| info: Beginning Dredd testing... |
| pass: GET /api/products duration: 65ms |
| complete: 1 passing, 0 failing, 0 errors, 0 skipped, 1 total |
| complete: Tests took 501ms |
| [Mon Aug 8 13:05:54 2016] 127.0.0.1:45618 [200]: /api/products |
| complete: See results in Apiary at: https://app.apiary.io/public/tests/run/7c23d4ae-aff2-4daf-bbdf-9fd76fc58b97 |
| info: Sending SIGTERM to the backend server |
| info: Backend server was killed |
And now we can see that our test passes.
Next, we’ll implement a test for fetching a single product:
| ## Get a product [GET /api/products/1] |
| Get a single product |
| |
| + Request (application/json) |
| |
| + Response 200 (application/json) |
| + Body |
| |
| { |
| "id": 1, |
| "name": "Purple widget", |
| "description": "A purple widget", |
| "price": 5.99, |
| "attributes": "{\"colour\": \"Purple\",\"size\": \"Small\"}", |
| "created_at": "*", |
| "updated_at": "*" |
| } |
Note the same basic format - we define the URL that should be fetched, the content of the request, and the response, including the status code.
Let’s hook up our route in app/Http/routes.php
:
$app->get('/api/products/{id}', 'ProductController@show');
And add the show() method to the controller:
| public function show($id) |
| { |
| |
| $product = $this->product->findOrFail($id); |
| |
| |
| return response()->json($product, 200); |
| } |
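Note that findOrFail() will throw a ModelNotFoundException if no product matches the given ID. If you want the API to return a JSON 404 response in that case, rather than the framework's default error page, one option is to amend the render() method in app/Exceptions/Handler.php along these lines - a sketch, and not something the rest of this tutorial relies on:
| public function render($request, Exception $e) |
| { |
|     // Return a JSON 404 when a model lookup fails |
|     if ($e instanceof \Illuminate\Database\Eloquent\ModelNotFoundException) { |
|         return response()->json(['error' => 'Not found'], 404); |
|     } |
| |
|     return parent::render($request, $e); |
| } |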
Running Dredd again should show this method has been implemented:
| $ dredd |
| info: Configuration './dredd.yml' found, ignoring other arguments. |
| info: Using apiary reporter. |
| info: Starting server with command: php -S localhost:3000 -t public/ |
| info: Waiting 3 seconds for server command to start... |
| info: Beginning Dredd testing... |
| pass: GET /api/products duration: 66ms |
| [Mon Aug 8 13:21:31 2016] 127.0.0.1:45750 [200]: /api/products |
| pass: GET /api/products/1 duration: 17ms |
| complete: 2 passing, 0 failing, 0 errors, 0 skipped, 2 total |
| complete: Tests took 521ms |
| [Mon Aug 8 13:21:31 2016] 127.0.0.1:45752 [200]: /api/products/1 |
| complete: See results in Apiary at: https://app.apiary.io/public/tests/run/bb6d03c3-8fad-477c-b140-af6e0cc8b96c |
| info: Sending SIGTERM to the backend server |
| info: Backend server was killed |
That’s our read support done. We just need to add support for POST
, PATCH
and DELETE
methods.
Our remaining methods
Let’s set up the test for our POST
method first:
| ## Create products [POST /api/products] |
| Create a new product |
| |
| + name (string) - The product name |
| + description (string) - The product description |
| + price (float) - The product price |
| + attributes (string) - The product attributes |
| |
| + Request (application/json) |
| + Body |
| |
| { |
| "name": "Blue widget", |
| "description": "A blue widget", |
| "price": 5.99, |
| "attributes": "{\"colour\": \"blue\",\"size\": \"Small\"}" |
| } |
| |
| + Response 201 (application/json) |
| + Body |
| |
| { |
| "id": 2, |
| "name": "Blue widget", |
| "description": "A blue widget", |
| "price": 5.99, |
| "attributes": "{\"colour\": \"blue\",\"size\": \"Small\"}", |
| "created_at": "*", |
| "updated_at": "*" |
| } |
Note that we specify the format of the parameters that should be passed through, and that our status code should be 201, not 200 - 201 Created is the more appropriate choice when creating a resource. Be careful of the whitespace - I had some odd issues with it. Next, we add our route:
$app->post('/api/products', 'ProductController@store');
And the store() method in the controller:
| public function store(Request $request) |
| { |
| |
| $valid = $this->validate($request, [ |
| 'name' => 'required|string', |
| 'description' => 'required|string', |
| 'price' => 'required|numeric', |
| 'attributes' => 'string', |
| ]); |
| |
| |
| $product = new $this->product; |
| $product->name = $request->input('name'); |
| $product->description = $request->input('description'); |
| $product->price = $request->input('price'); |
| $product->attributes = $request->input('attributes'); |
| |
| |
| $product->save(); |
| |
| |
| return response()->json($product, 201); |
| } |
Note that we validate the attributes, to ensure they are correct and that the required ones exist. Running Dredd again should show the route is now in place:
| $ dredd |
| info: Configuration './dredd.yml' found, ignoring other arguments. |
| info: Using apiary reporter. |
| info: Starting server with command: php -S localhost:3000 -t public/ |
| info: Waiting 3 seconds for server command to start... |
| info: Beginning Dredd testing... |
| pass: GET /api/products duration: 69ms |
| [Mon Aug 8 15:17:35 2016] 127.0.0.1:47316 [200]: /api/products |
| pass: GET /api/products/1 duration: 18ms |
| [Mon Aug 8 15:17:35 2016] 127.0.0.1:47318 [200]: /api/products/1 |
| pass: POST /api/products duration: 42ms |
| complete: 3 passing, 0 failing, 0 errors, 0 skipped, 3 total |
| complete: Tests took 575ms |
| [Mon Aug 8 15:17:35 2016] 127.0.0.1:47322 [201]: /api/products |
| complete: See results in Apiary at: https://app.apiary.io/public/tests/run/cb5971cf-180d-47ed-abf4-002378941134 |
| info: Sending SIGTERM to the backend server |
| info: Backend server was killed |
Next, we’ll implement PATCH
. This targets an existing object, but accepts parameters in the same way as POST
:
| ## Update existing products [PATCH /api/products/1] |
| Update an existing product |
| |
| + name (string) - The product name |
| + description (string) - The product description |
| + price (float) - The product price |
| + attributes (string) - The product attributes |
| |
| + Request (application/json) |
| + Body |
| |
| { |
| "name": "Blue widget", |
| "description": "A blue widget", |
| "price": 5.99, |
| "attributes": "{\"colour\": \"blue\",\"size\": \"Small\"}" |
| } |
| |
| + Response 200 (application/json) |
| + Body |
| |
| { |
| "id": 2, |
| "name": "Blue widget", |
| "description": "A blue widget", |
| "price": 5.99, |
| "attributes": "{\"colour\": \"blue\",\"size\": \"Small\"}", |
| "created_at": "*", |
| "updated_at": "*" |
| } |
We add our new route:
$app->patch('/api/products/{id}', 'ProductController@update');
And our update() method:
| public function update(Request $request, $id) |
| { |
| |
| $valid = $this->validate($request, [ |
| 'name' => 'string', |
| 'description' => 'string', |
| 'price' => 'numeric', |
| 'attributes' => 'string', |
| ]); |
| |
| |
| $product = $this->product->findOrFail($id); |
| |
| |
| if ($request->has('name')) { |
| $product->name = $request->input('name'); |
| } |
| if ($request->has('description')) { |
| $product->description = $request->input('description'); |
| } |
| if ($request->has('price')) { |
| $product->price = $request->input('price'); |
| } |
| if ($request->has('attributes')) { |
| $product->attributes = $request->input('attributes'); |
| } |
| |
| |
| $product->save(); |
| |
| |
| return response()->json($product, 200); |
| } |
Here we can’t guarantee every parameter will exist, so we test for it. We run Dredd again:
| $ dredd |
| info: Configuration './dredd.yml' found, ignoring other arguments. |
| info: Using apiary reporter. |
| info: Starting server with command: php -S localhost:3000 -t public/ |
| info: Waiting 3 seconds for server command to start... |
| info: Beginning Dredd testing... |
| pass: GET /api/products duration: 74ms |
| [Mon Aug 8 15:27:14 2016] 127.0.0.1:47464 [200]: /api/products |
| pass: GET /api/products/1 duration: 19ms |
| [Mon Aug 8 15:27:14 2016] 127.0.0.1:47466 [200]: /api/products/1 |
| pass: POST /api/products duration: 36ms |
| [Mon Aug 8 15:27:14 2016] 127.0.0.1:47470 [201]: /api/products |
| [Mon Aug 8 15:27:14 2016] 127.0.0.1:47474 [200]: /api/products/1 |
| pass: PATCH /api/products/1 duration: 34ms |
| complete: 4 passing, 0 failing, 0 errors, 0 skipped, 4 total |
| complete: Tests took 2579ms |
| complete: See results in Apiary at: https://app.apiary.io/public/tests/run/eae98644-44ad-432f-90fc-5f73fa674f66 |
| info: Sending SIGTERM to the backend server |
| info: Backend server was killed |
One last method to implement - the DELETE method. Add this to apiary.apib:
| ## Delete products [DELETE /api/products/1] |
| Delete an existing product |
| |
| + Request (application/json) |
| |
| + Response 200 (application/json) |
| + Body |
| |
| { |
| "status": "Deleted" |
| } |
Next, add the route:
$app->delete('/api/products/{id}', 'ProductController@destroy');
And the destroy() method in the controller:
| public function destroy($id) |
| { |
| |
| $product = $this->product->findOrFail($id); |
| |
| |
| $product->delete(); |
| |
| |
| return response()->json(['status' => 'deleted'], 200); |
| } |
And let’s run Dredd again:
| $ dredd |
| info: Configuration './dredd.yml' found, ignoring other arguments. |
| info: Using apiary reporter. |
| info: Starting server with command: php -S localhost:3000 -t public/ |
| info: Waiting 3 seconds for server command to start... |
| info: Beginning Dredd testing... |
| pass: GET /api/products duration: 66ms |
| [Mon Aug 8 15:57:44 2016] 127.0.0.1:48664 [200]: /api/products |
| pass: GET /api/products/1 duration: 19ms |
| [Mon Aug 8 15:57:44 2016] 127.0.0.1:48666 [200]: /api/products/1 |
| pass: POST /api/products duration: 45ms |
| [Mon Aug 8 15:57:44 2016] 127.0.0.1:48670 [201]: /api/products |
| pass: PATCH /api/products/1 duration: 24ms |
| [Mon Aug 8 15:57:44 2016] 127.0.0.1:48674 [200]: /api/products/1 |
| pass: DELETE /api/products/1 duration: 27ms |
| complete: 5 passing, 0 failing, 0 errors, 0 skipped, 5 total |
| complete: Tests took 713ms |
| [Mon Aug 8 15:57:44 2016] 127.0.0.1:48678 [200]: /api/products/1 |
| complete: See results in Apiary at: https://app.apiary.io/public/tests/run/a3e11d59-1dad-404b-9319-61ca5c0fcd15 |
| info: Sending SIGTERM to the backend server |
| info: Backend server was killed |
Our REST API is now finished.
Generating an HTML version of your documentation
Now that we have finished documenting and implementing our API, we need to generate an HTML version of it. One way is to use aglio:
$ aglio -i apiary.apib -o output.html
This will write the documentation to output.html. There's also scope for choosing different themes if you wish.
You can also use Apiary, which has the advantage that they’ll create a stub of your API so that if you need to work with the API before it’s finished being implemented, you can use that as a placeholder.
Summary
The Blueprint language is a useful way of documenting your API, and it makes doing so simple enough that it's hard to weasel out of it. It's worth taking a closer look at the specification, as it goes into quite a lot of detail. Keeping the documentation and the implementation in sync by hand is hard, so it's a good idea to use Dredd to check that any changes you make don't invalidate the documentation. With Aglio or Apiary, you can easily convert the documentation into a more attractive format.
You’ll find the source code for this demo API on Github, so if you get stuck, take a look at that. I did have a fair few issues with whitespace, so bear that in mind if it behaves oddly. I’ve also noticed a few quirks, such as Dredd not working properly if a route returns a 204 response code, which is why I couldn’t use that for deleting - this appears to be a bug, but hopefully this will be resolved soon.
I’ll say it again, Dredd is not a substitute for proper unit tests, and under no circumstances should you use it as one. However, it can be very useful as a way to plan how your API will work and ensure that it complies with that plan, and to ensure that the implementation and documentation don’t diverge. Used as part of your normal continuous integration setup, Dredd can make sure that any divergence between the docs and the application is picked up on and fixed as quickly as possible, while also making writing documentation less onerous.
5th June 2016 4:32 pm
I use Jenkins as my main continuous integration solution at work, largely for two reasons:
- It generally works out cheaper to host it ourselves than to use one of the paid CI solutions for closed-source projects
- The size of the plugin ecosystem
However, we also use Travis CI for testing one or two open-source projects, and one distinct advantage Travis has is the way you can configure it using a single text file.
With the Pipeline plugin, it’s possible to define the steps required to run your tests in a Jenkinsfile
and then set up a Pipeline job which reads that file from the version control system and runs it accordingly. Here’s a sample Jenkinsfile
for a Laravel project:
| node { |
| |
| stage 'Checkout' |
| |
| |
| git credentialsId: '5239c33e-10ab-4c1b-a4a0-91b96a07955e', url: 'git@bitbucket.org:matthewbdaly/my-app.git' |
| |
| |
| stage 'Install dependencies' |
| |
| |
| sh 'composer install' |
| |
| |
| stage 'Test' |
| |
| |
| sh "vendor/bin/phpunit" |
| } |
Note the steps it’s broken down into:
- stage defines the start of a new stage in the build
- git defines a point where we check out the code from the repository
- sh defines a point where we run a shell command
Using these three commands it’s straightforward to define a fairly simple build process for your application in a way that’s more easily repeatable when creating new projects - for instance, you can copy this over to a new project and change the source repository URL and you’re pretty much ready to go.
Unfortunately, support for the Pipeline plugin is missing from a lot of Jenkins plugins - for instance, I can't publish the XML coverage reports. This is something of a deal-breaker for most of my projects, as I use those reporting plugins a lot - it's one of the reasons I chose Jenkins over Travis. Still, this is definitely a big step forward, and if you don't need this kind of reporting then there's no reason not to consider using the Pipeline plugin for your Jenkins jobs. Hopefully in future more plugins will be amended to work with Pipeline so that it's more widely usable.
22nd May 2016 11:29 pm
You may have heard of Google’s AMP Project, which allows you to create mobile-optimized pages using a subset of HTML. After seeing the sheer speed at which you can load an AMP page (practically instantaneous in many cases), I was eager to see if I could apply it to my own site.
I still wanted to retain the existing functionality for my site, such as comments and search, so I elected not to rewrite the whole thing to make it AMP-compliant. Instead, I opted to create AMP versions of every blog post, and link to them from the original. This preserves the advantages of AMP since search engines will be able to discover it from the header of the original, while allowing those wanting a richer experience to view the original, where the comments are hosted. You can now view the AMP version of any post by appending amp/ to its URL.
The biggest problem was the images in the post body, as the <img> tag needs to be replaced by the <amp-img> tag, which also requires an explicit height and width. I wound up amending the renderer for AMP pages to render an image tag as an empty string, since I have only ever used one image in the post body and I think I can live without them.
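For reference, if you did want to keep the images, one rough way to handle the conversion in PHP would be to walk the rendered HTML with the DOM extension and read each image's dimensions from the file itself. This is just a sketch, assuming the images live on the local filesystem - it's not what this site actually does:
| <?php |
| |
| // Sketch: rewrite <img> tags as <amp-img> tags with explicit dimensions. |
| // Assumes image paths can be resolved against a local base directory. |
| function convertImagesToAmp(string $html, string $imageBasePath): string |
| { |
|     $doc = new DOMDocument(); |
|     @$doc->loadHTML($html); |
| |
|     // Copy the node list first, as it updates live while nodes are replaced |
|     foreach (iterator_to_array($doc->getElementsByTagName('img')) as $img) { |
|         $src = $img->getAttribute('src'); |
|         $size = @getimagesize($imageBasePath . $src); |
| |
|         $ampImg = $doc->createElement('amp-img'); |
|         $ampImg->setAttribute('src', $src); |
|         $ampImg->setAttribute('layout', 'responsive'); |
|         if ($size) { |
|             $ampImg->setAttribute('width', (string) $size[0]); |
|             $ampImg->setAttribute('height', (string) $size[1]); |
|         } |
|         $img->parentNode->replaceChild($ampImg, $img); |
|     } |
| |
|     return $doc->saveHTML(); |
| } |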
It’s also a bit of a pain styling it as it will be awkward to use Bootstrap. I’ve therefore opted to skip Bootstrap for now and write my own fairly basic theme for the AMP pages instead.
It’ll be interesting to see what effect having the AMP versions of the pages available will have on my site in terms of search results. It obviously takes some time before the page gets crawled, and until then the AMP version won’t be served from the CDN used by AMP, so I really can’t guess what effect it will have right now.