A previous post described how to use d3.js to create a force-directed graph we can zoom in on and select nodes from. Such a tool is useful for displaying and arranging larger networks. My colleagues and I used it to create a small web application for displaying RNA secondary structure.
Since that post, a new version of D3 has been released. D3 v4 introduced a lot of useful new features (see Irene Ros’s excellent overview of the differences between v3 and v4). Unfortunately, however, it did not maintain backward compatibility with previous versions of D3. This means that the previous selectable zoomable force-directed graph example could not be used with new code written against the latest version of the D3 library. Until now.
As in the previous example, this graph provides the same selection behavior, including shift-drag brush selection.
Upgrading the selectable zoomable force-directed graph implementation to D3 v4 required a few minor and not-so-minor changes.
d3-brush: I took the d3-brush module and modified it so that it doesn’t capture the shift events. The new version (d3-brush-lite) can be found on github. There is an open github issue to disable this behavior in d3-brush.
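To illustrate the division of labor between the two behaviors, here is a minimal sketch (not the code from the example) that uses d3.zoom’s event filter to leave shift-drags for the brush while plain drags pan and zoom; the zoomed handler is assumed to exist elsewhere:
var zoom = d3v4.zoom()
    // let shift-drags fall through to the brush; handle everything else here
    .filter(function () { return !d3v4.event.shiftKey; })
    .on('zoom', zoomed);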
Fixed nodes: D3 v4 introduced the .fx and .fy node parameters, which pin a node at a fixed position. This eliminates the need to set the .fixed parameter on each node. The graph data has the following format:
"nodes": [
{"id": "Myriel", "group": 1},
{"id": "Napoleon", "group": 1},
{"id": "Mlle.Baptistine", "group": 1},
...
],
"links": [
{"source": "Napoleon", "target": "Myriel", "value": 1},
{"source": "Mlle.Baptistine", "target": "Myriel", "value": 8},
...
]
}
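As a sketch of how the new fields are typically used, here is the common D3 v4 drag pattern, which pins a node by setting .fx and .fy while it is being dragged (it assumes a force simulation stored in a variable named simulation):
function dragstarted(d) {
  // reheat the simulation and pin the node at its current position
  if (!d3v4.event.active) simulation.alphaTarget(0.3).restart();
  d.fx = d.x;
  d.fy = d.y;
}

function dragged(d) {
  d.fx = d3v4.event.x;
  d.fy = d3v4.event.y;
}

function dragended(d) {
  // cool the simulation and release the node
  if (!d3v4.event.active) simulation.alphaTarget(0);
  d.fx = null;
  d.fy = null;
}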
The source code for this example can be found as a github gist or on bl.ocks.org.
D3 behaviors, such as d3.zoom, work by responding to events which pass through the element on which they are called. If the element has children, the behavior will be called as long as the children don’t block the events’ propagation. This is often beneficial: if we want to be able to zoom on a populated SVG, we need only call the zoom behavior on the root node and we’ll be able to pan and zoom even if we drag and scroll on the child elements.
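In its simplest form this looks something like the following sketch (assuming the page already contains an svg element):
var svg = d3v4.select('svg');
var g = svg.append('g');  // all child elements are drawn inside this group

// Calling the behavior on the root svg is enough: events on the children
// propagate up, so the whole scene pans and zooms.
svg.call(d3v4.zoom().on('zoom', function () {
  g.attr('transform', d3v4.event.transform);
}));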
There are times, however, when we may want to ignore certain elements without having them block the propagation of the event. For this, there is event filtering. By filtering events, we can let them pass through without having to block or process them. This can be seen in the example below, where dragging the background leads to panning, while dragging the circles has no effect.
The crux of the code for this example is a simple check to see that handled events have not passed through an element with a no-zoom class.
var zoom = d3v4.zoom()
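    // event.path[0] is the event's original target; note that event.path is
    // non-standard (Chrome-only), and event.target is the portable equivalent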
.filter(() => { return !d3v4.event.path[0].classList.contains('no-zoom') })
.on('zoom', function(d) { g.attr('transform', d3v4.event.transform); });
A bl.ock of this example can be found here.
Switching from gulp and webpack-stream to webpack-dev-server reduces the rebuild time for a 5500-line javascript project from ~11 seconds to ~1.3 seconds.
Whenever I create a javascript project, I do it using a very uniform directory structure and configuration, as outlined in a previous blog post. With this configuration, all the source files are transpiled using babel and bundled using the webpack-stream module as part of a step in the build process managed by gulp.
This is great because then I can run gulp serve and have it recompile and reload the resulting web page whenever I make any changes to the source code in app/scripts.
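For context, the gulp task doing the bundling looks roughly like this (a sketch of the setup, not the exact gulpfile; the entry point and output paths are assumptions):
var gulp = require('gulp');
var webpackStream = require('webpack-stream');

gulp.task('scripts', function () {
  // re-bundle the app with webpack whenever a source file changes
  return gulp.src('app/scripts/main.jsx')
    .pipe(webpackStream(require('./webpack.config.js')))
    .pipe(gulp.dest('build/scripts'));
});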
This works like a charm until the source code and dependencies reach any appreciable size. As more and more files need to be transpiled, the process gets slower and slower until, at about 10 seconds per rebuild, it starts to get annoying:
[BS] 3 files changed (main.js, playground.js, worker.js)
[08:31:20] Finished 'scripts' after 11 s
So how can this be sped up? Easy: stop using gulp and webpack-stream and switch to the webpack dev server.
The webpack dev server runs in its own terminal and watches the source files listed in its config file (webpack.config.js). When one of the files changes, it recreates the output files specified in its config and reloads the web page. I run it using the following command line:
webpack-dev-server --content-base app --display-exclude --profile --inline | grep -v "\[\d*\]"
The grep at the end is to filter out some of the [overly] verbose output that webpack produces. So how long does it take to regenerate the code when a source file is changed?
Version: webpack 1.12.15
Time: 1296ms
chunk {0} main.js (main) 4.61 MB
This is about 10x faster than the configuration using gulp and webpack-stream.
The resulting web page can be found at http://localhost:8080/index.html.
The only thing I needed to change in my webpack.config.js file was to add output: { publicPath: '/scripts/' }. This is because my index.html file loads the compiled scripts from the scripts directory:

<script src='scripts/playground.js'></script>
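This matters because webpack-dev-server serves the compiled bundles from memory at the publicPath rather than writing them to disk, so the path it serves from has to match the path the page requests. The relevant corner of the config (repeated from the full file below):
// excerpt from webpack.config.js
output: {
    publicPath: '/scripts/'  // bundles are served from http://localhost:8080/scripts/
}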
Below is the entire webpack.config.js for this project. Notice that there are multiple targets being built, including a worker script that can be used in a web worker to do compute-intensive tasks off the main UI thread. Other notable sights include the devtool: "cheap-source-map" entry, which makes sure we can easily see the source code when debugging.
var path = require('path');
var webpack = require('webpack');

module.exports = {
    context: __dirname + '/app',
    // three bundles: two pages and a script for use in a web worker
    entry: {
        playground: ['./scripts/playground.jsx'],
        main: ['./scripts/main.jsx'],
        worker: ['./scripts/worker.js']
    },
    devtool: "cheap-source-map",
    output: {
        path: __dirname + '/build',
        publicPath: '/scripts/',  // must match the path index.html loads scripts from
        filename: '[name].js',
        libraryTarget: 'umd',
        library: '[name]'
    },
    module: {
        loaders: [
            {
                test: /\.jsx?$/,
                //exclude: /node_modules/,
                include: [path.resolve(__dirname, 'app/scripts')],
                loader: 'babel-loader',
                query: {
                    presets: ['es2015', 'react']
                }
            }, {
                test: /\.css$/,
                loader: 'style!css'
            }
        ],
        postLoaders: [
            {
                include: path.resolve(__dirname, 'node_modules/pixi.js'),
                loader: 'transform?brfs'
            }
        ]
    },
    externals: {
    },
    resolve: {
        // the empty string is needed in webpack 1 so that imports with an
        // explicit extension (e.g. './worker.js') still resolve
        extensions: ['', '.js', '.jsx']
    }
};