Containers (not ioc)

And not docker either, for that matter.

After doing this for a while I’m realizing that functional programming in javascript is basically like porting Haskell.  Or as much of it as our language will allow.  Building up the pieces that allow you to compose functions in a friendly sort of way.  Not that I know Haskell or anything about it, just sayin.

The first step to building functions that are easily composed is to take your value and put it in a container that can be treated the same way no matter what is inside it.  This homogenizing of values is similar to the standardization that took place at the beginning of the industrial revolution.  Any container of a specific type can be treated the same way regardless of the contents within.  This may not be the best analogy, or even 100% accurate, but I like it.

The simplest container ( more commonly and heretofore known as a functor ) is the identity functor.  It looks like this

a -> a
// takes a value of type "a" and returns a value of type "a"
function(a) { return a; };

Not very interesting, admittedly.  However, the definition of a functor, dumbed down for me, is an object ( here a value wrapped in a function ) that can be mapped to another object of the same type ( paraphrased from here ).  It is typically shown as

(a -> b) -> functor(a) -> functor(b)

So for instance

var whadYouSay = Identity("hell").map(x => x + "o!");
==> Identity("hello!")
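
To make that concrete, here is a minimal sketch of the kind of Identity container I mean ( my own illustration, not from any particular library ):

var Identity = function(value) {
    return {
        // map: apply fn to the contents and re-wrap the result
        map: function(fn) { return Identity(fn(value)); },
        // a way to see what's inside, for logging
        inspect: function() { return 'Identity(' + value + ')'; }
    };
};

var whadYouSay = Identity('hell').map(x => x + 'o!');
console.log(whadYouSay.inspect()); // ==> Identity(hello!)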

A nice image of this can be found in this post which is probably a better article to read than this one.

[Image: an increment function being mapped over an Identity of 1]

This is the basic building block upon which we can build an empire.  This notion of encapsulation, a few applicable rules, and the idea of currying will allow us to build compositions that are agnostic of their content and immutable.
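
As a small taste of that composition ( again a minimal sketch, not anyone's library code ):

// compose: feed the output of g into f, producing a new function
var compose = function(f, g) {
    return function(x) { return f(g(x)); };
};

var increment = function(x) { return x + 1; };
var double = function(x) { return x * 2; };

// the composed function neither knows nor mutates its content
var incThenDouble = compose(double, increment);
console.log(incThenDouble(1)); // ==> 4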

In my next post I’ll write about Prop, SafeProp and Maybe.


Functional Programming in Javascript

In this post I will attempt to articulate the reasons and goals I have for using a functional style of programming in javascript.

I’m building a distributed event sourcing system in node.  The part I’m focusing on now is the event dispatcher.  It receives events from the eventstore, determines if they are relevant, and hands them to the appropriate eventhandler.

I had already written this piece in an OO manner, with inheritance, fancy es6 features, and lots of imperative code.  I found that there were so many small pieces of fairly simple logic that it was very easy to make mistakes or miss possible error scenarios.  I wrote shit tons of logging code, ran complex unit and integration tests, and watched the logging output.

My pain points were lots of if-then logic, lots of null check logic, lots of stream parsing logic, lots of objects with state and functionality, and trouble with inheritance using es6 classes.

Functional programming, as I started reading up on it, addressed each of these issues.

Using Maybe I avoided ( deferred and centralized, really ) if-then logic, null check logic, and error handling.  My functions were more numerous, but smaller, stateless and easy to test.  I found that I could build up a workflow by reusing the composite pieces, then simply call the function rather than invoke the behavior from the various pieces that contained the state.  As I was not mutating state, this also reduced the mental overhead of the app.
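
To give a flavor of what Maybe buys you ( this is a minimal sketch of the idea, not the actual code from my dispatcher ):

// a bare bones Maybe: map is a no-op when the value is null or
// undefined, so the null check lives in one place
var Maybe = function(value) {
    return {
        map: function(fn) {
            return value == null ? Maybe(null) : Maybe(fn(value));
        },
        getOrElse: function(fallback) {
            return value == null ? fallback : value;
        }
    };
};

// one pipeline handles both the happy path and the missing data path
var event = { metadata: { streamName: 'command-1234' } };
var streamName = Maybe(event)
    .map(function(e) { return e.metadata; })
    .map(function(m) { return m.streamName; })
    .getOrElse('unknown');
console.log(streamName); // ==> command-1234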

In the next post I will start to introduce the pieces that made this possible.  If this is a topic that you are interested in, I invite you to visit Functional Programming in Javascript, a google group for discussing this style of programming.


Update all npm packages to the latest version

Fun with bash.  This is a command that will update all of your packages that are behind to the latest version.  I know you can do npm update, but that doesn’t seem to do it.  For some reason.  Whatever, this was fun.

npm outdated | awk 'NR!=1&&$4!="git" {printf "%s%s%s\n", $1,"@", $4}' | xargs -t -n1 npm update --save

npm outdated will give a result like this

Package              Current  Wanted        Latest  Location
co-body              4.0.0    1.2.0         4.0.0   co-body
co-views             2.1.0    0.3.0         2.1.0   co-views
corelogger           0.0.1    git           git     corelogger
dagon                0.2.1    git           git     dagon
del                  2.0.2    1.2.1         2.0.2   del
eventstore           0.0.1    git           git     eventstore
koa                  1.0.0    0.21.0        1.0.0   koa
koa-generic-session  1.9.2    1.8.0         1.9.2   koa-generic-session
koa-logger           1.3.0    1.2.2         1.3.0   koa-logger
koa-passport         1.2.0    1.1.6         1.2.0   koa-passport
koa-router           5.0.1    5.0.1         5.2.3   koa-router
mocha                2.3.3    2.1.0         2.3.3   mocha
must                 0.13.1   0.13.0-beta2  0.13.1  must
path                 0.11.14  0.11.14       0.12.7  path
readstorerepository  0.0.1    git           git     readstorerepository
rx                   2.5.3    2.5.3         4.0.0   rx

Christ.  That’s a five column table that looks a lot better in your terminal.

awk 'NR!=1&&$4!="git"

This says NR!=1: ignore the first line, which is just the headers.  AND skip lines where the fourth column is “git”.  That will be the case if you have dependencies that live in git and not in npm.

{printf "%s%s%s\n", $1,"@", $4}'

This says print column 1, then “@”, then column 4, then a newline.  If you don’t do it with printf ( print formatted, I presume ) you end up with co-body @ 4.0.0 rather than co-body@4.0.0, which npm won’t like.  So now you will have

co-body@4.0.0
co-views@2.1.0
del@2.0.2
koa@1.0.0
koa-generic-session@1.9.2
koa-logger@1.3.0
koa-passport@1.2.0
koa-router@5.2.3
mocha@2.3.3
must@0.13.1

Pipe this into

xargs -t -n1 npm update --save

Which says: for each line, run npm update with that line as the argument, and print the command out so we can see what’s going on.  The -n1 tells xargs to pass just one line of input per command ( the npm update ).  The -t is the verbose flag ( it shows each command before running it ).  I guess you could do it without the -n1 actually, but whatever.  --save will of course update your package.json.

Now if anyone reads this they may comment “oh here’s how you do it with npm in 5 characters” and I’ll feel dumb, but this was fun anyway.


Managing node modules and files. Part II

In my last post I discussed how I had started building an application with different parts that were all broken into SRP files that exported a value, and thus were modules.  I discussed how I refactored this application so that each part was an autonomous module, with its own repo, package.json and entry point.

To be fair (to myself), my overall app was made up of several autonomous parts, but each part had several sub parts.  For instance, maybe one part connected to two different databases.  Both connections were used in the same app.  Separate files/modules and such, but not in separate repos with their own package.json.

Ok, moving on.  I also spoke about how backlash against the idea of a Dependency Container was one reason I did this refactor.  Now I’d like to speak about the differences and similarities from an IOC perspective.

Some modules need to be configured, whether they are in their own repo or just another file.  For instance, a database needs a connection string, your logger needs a level set and perhaps some application variables, and a reusable module might need some information about what sort of entities it’s working with.  This is state, I guess, and what I have heard often is that modules should be stateless.  Most of my modules are stateless, but you can’t get around the fact that your application needs an input and an output.  If this is a web server then that is nice and tidy at the top level, but for worker modules the input and output are often a database or a queue, and those are environment specific.

So what ends up happening is that you have some config values set in the top consuming app and passed to the modules that need them.  However, in the case of the database connection, this module is often rather deep.  The only way to get the values to their target is to require all your dependencies that need to be configured at the top level app, configure them, and then pass them in as parameters to the dependencies that need them.  This is IOC.  Back in the C# world it’s what we called the poor man’s IOC.  The end result for one of my application services looks like this

var _eventDispatcher = require('eventDispatcher');
var _readStoreRepository = require('readStoreRepository');
var _eventHandlerBase = require('eventHandlerBase');
var _eventRepository = require('eventRepository');
var commandHandlers = require('./CommandHandlers/index');
var _eventStore = require('eventStore');
var yowlWrapper = require('yowlWrapper');
var extend = require('extend');
var config = require('config');

module.exports = function(_options){
    var options = config.get('myOptions');
    extend(options, _options || {});

    var logger = yowlWrapper(options.logger);

    // bottom level modules -- the ones that need the configuration
    var eventStore = _eventStore({eventstore: options.eventstore});
    var readStoreRepository = _readStoreRepository(
        {postgres: options.postgres});

    // mid level modules, handed the configured bottom level pieces
    var eventRepository = _eventRepository(eventStore);
    var eventHandlerBase = _eventHandlerBase(eventStore,
                                             readStoreRepository);

    var handlers = commandHandlers(eventHandlerBase,
                                   eventRepository,
                                   readStoreRepository,
                                   logger);

    // top level module, instantiated, configured and ready to go
    var eventDispatcher = _eventDispatcher(handlers,
                                           eventStore,
                                           {targetStreamType: 'command'});
    eventDispatcher.startDispatching();
};

As you can see, the eventStore and the readStore are the bottom level modules, but they are the ones that need the configuration.  They are then handed to the mid level modules, the repository and the handlers, which are in turn handed to the top level module, all instantiated, configured and ready to go.

In contrast, with my IOC Container my setup looked like this

module.exports = function(gesDispatcher, commandHandlers, logger) {
    return function () {
        logger.debug('instantiating dispatcher');
        var dispatcher = new gesDispatcher({
            targetType: 'command',
            handlers: commandHandlers.map(x=>new x())
        });
        dispatcher.startDispatching();
    }
};

It was pointed out that I should really show the config from the container for a proper comparison.  So here is the bootstrapper from my old code.

var _container = require('dagon');
var config = require('config');

module.exports = new _container(x =>
    x.pathToRoot(__dirname)
        .requireDirectoryRecursively('./src')
        .groupAllInDirectory('./src/CommandHandlers', 'commandHandlers')
        .for('logger').require('YowlWrapper')
        .for('logger')
            .instantiate(x => x.asFunc()
                               .withParameters(config.get('logger')))
        .for('gesConnection')
            .instantiate(x => x.asFunc()
                               .withParameters(config.get('eventStore')))
        .for('readModelRepository')
            .instantiate(x => x.asFunc()
                               .withParameters(config.get('postgres')))
        .rename('lodash').withThis('_')
        .rename('bluebird').withThis('Promise')
        .complete());

This would not currently work with dagon and my new module pattern.  It worked the way I had it before.  Plus it could be a bit more elegant than requiring the config module.  But dagon is on my back burner right now.  I like the new stuff I’m learning and I may as well let it all shake out before I start thinking about making it better.

Poor man’s IOC sounds … a bit pejorative, I guess.  But in node, with very small and autonomous pieces, it’s not nearly that bad.  In fact, you could argue that it’s more explicit.  I can now accomplish all the things that I found so hard to do before.  The things that a container wrapped up and did for me.

So my ultimate ( as of this time 🙂 ) conclusion is this:  First, I was composing my modules poorly.  This led me to feel that I was unable to access the pieces that I needed in order to configure dynamic variables.  Second, by doing this manual IOC I could configure everything explicitly, and stub things out if necessary during testing, without digging way down into it.  And third, while I had started this post as an explanation for why I don’t need a container, in fact I think that a container does make things a bit cleaner.

Boom, there, I said it.  Flip flopped right in my own posting! Still, the exploration has led to a nice pattern of making SRP modules explicit by breaking them out into their own repos and composing them more clearly.  I also feel that I could very easily use a container or manual IOC and still develop an application with the ability to access and substitute modules at runtime.


Managing node modules and files. Part I

Recently I wrote a dependency injection container for node: DAGon ( npm won’t let you use caps, so dagon ).  It worked quite nicely and I really liked using it in my app. However, node people have stated at every turn that such a thing is anathema.

The reason, they say, is that you should write small modules that do one thing ( SRP ).  Well, this was a bit confusing for me.  You see, each file that you write in node exports a value, either a function or an object, and is called a module.  And in my rather large application, each of those files did just one thing.  They were in fact modules that were SRP.  This didn’t help me with the fact that I needed to reference modules everywhere using relative paths.  And that those modules needed to be configured, and so on.  The case for a container that handles all that seemed stronger and stronger.

However, if you hit me in the head enough times, at the right angle, I sometimes learn something.  It seems that the word module is kind of overloaded.  Yes, your file is a module, but I think what people are referring to is a module which is in its own repository, with its own package.json and index.js ( or starting file ).  This module is still small and SRP, so the relative paths are right next door.  Usually in the same folder.

Now you require your truly autonomous modules in your top level app.  The small pieces are easier to test because, by necessity, they are not entwined with other parts of the app. The top level app acts sort of like a controller that orchestrates these other pieces.
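
As an illustration ( the name and contents here are hypothetical, not one of my actual modules ), the entry point of such an autonomous module might look like this:

// index.js of a hypothetical readStoreRepository module, living in
// its own repo with its own package.json ( "main": "index.js" ).
// It exports exactly one thing: a factory that takes its
// configuration from the consuming app, so the module itself
// holds no state of its own.
module.exports = function(options) {
    // a real module would open a postgres connection here using
    // options.postgres; it's stubbed out to keep the sketch standalone
    var connectionString = options.postgres;

    return {
        getById: function(id) {
            return { id: id, source: connectionString };
        }
    };
};

// and the top level app composes it:
//   var readStoreRepository = require('readStoreRepository');
//   var repo = readStoreRepository({ postgres: config.get('postgres') });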

Well I decided to bend to the pressure and my new understanding and I rewrote my app using this pattern.  I’ll summarize that process as follows: The refactor was pretty easy.  Fixing 7 million tests, not so much.  Tests tend to be tied to the implementation at least at a very low level, and as the implementation had fundamentally changed all my tests were pooched.  Still it felt kind of good to have these highly discrete modules.

In my next post I’ll talk about the difference between building the app in this manner versus building it with a DI Container.


Getting some keyboard shortcuts to work on ubuntu in parallels

I have found a strange idiosyncrasy of running Ubuntu on a Mac in Parallels: a lot of my keyboard shortcuts don’t work.  For instance, in a browser the alt + back arrow combination doesn’t work.  Or in Webstorm, alt + insert doesn’t bring up the add-a-file menu.

What I have found, at least for those two, is that if you hit the combo twice in succession it actually works. Consistently. Strange, but workable.  I don’t know how many of the other shortcuts will work if I try this.  Usually I just pound on it, get frustrated and grab the mouse.  I’ll have to be more patient and try.  So we’ll see.


Intro

Trying a fresh start with a new blog.

I am a software developer. I have spent the past 10 years developing in C# and suffering horribly under the yoke of both the windows operating system and the visual studio development environment.  While I have tried at every turn to avoid the bloat, highly recommended bad practices and shitty features that are offered up as a shining path by the evangelists, I feel that I can no longer endure this existence.

No, I’m not going to snuff it.  Now I’m switching my attention to the world outside the walled shit hole.  I imagine that North Korea is a great analogy for the Microsoft development platform.  You are told that what you have is the best of all possible worlds.  Surrounded by voices echoing the party line.  Told how lucky you are to live in such filth. All the while enjoying a standard of living not unlike that enjoyed by Byzantine peasants. Once you break free you see that your neighbors enjoy running water, running software, indoor toilets.

I have enjoyed working in javascript for some time, serving it up via .net apis deployed to windows server environments.  Now that I am a defector I am enjoying it all the more.  Learning ubuntu has been a joy, and writing software in nodejs is even more enjoyable now that I don’t have to always take into consideration that I’m using windows. Deploying to containers that can be published to various clouds is both exhilarating and inspiring.  And while Microsoft is working hard to create watered down, crappy implementations of all the things that make not working with windows great, I have seen their handiwork in the past and prefer to take whatever alternative there may be.

I have worked for years building up tools, skills, and patterns for working in .net.  I am sad to leave the little niche of sanity that I’ve been able to carve out of that huge block of mediocrity. But the future is bright and I am loving building up new patterns, skills and tools.

I will be writing here, about lessons learned, gotchas, and cool tools that I discover along the way.
