Showing posts from 2017

Simple build tooling for frontend web applications (gulp demo)

Please read this article in the GitHub source repo for full context. TL;DR: Why it pays to use professional tooling even for small and insignificant projects. I used to write my build tools in Bash and "automate" stuff via Makefiles. I used to create my websites manually without any build tool, since I just edit the HTML, CSS and JS files directly. It turns out that this is actually a big waste of my time. It also prevents me from adopting standard solutions for common problems. A specific example is the problem of proxies and browsers caching static assets like CSS and JS files. The symptom is that I have to press F5 repeatedly to see a change in my code. The best-practice solution is to change the filename of the static asset each time the content changes. Without automation this is already way beyond my manual editing, so I haven't used this simple trick so far. This little demo project for a static website shows how easy it is actually to setup and u
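The core of the cache-busting trick is easy to sketch independently of gulp (the article's demo presumably uses a gulp plugin for this; the function below is a hypothetical, language-neutral illustration in Python): derive a short hash from the file's content and embed it in the filename, so the name only changes when the content does.

```python
import hashlib
import re

def fingerprint(filename: str, content: bytes) -> str:
    """Return a cache-busting filename like style.d41d8cd9.css,
    derived from a short hash of the file's content."""
    digest = hashlib.md5(content).hexdigest()[:8]
    # insert the hash just before the file extension
    return re.sub(r"\.(\w+)$", rf".{digest}.\1", filename)

# Same content -> same name (browser cache hit);
# changed content -> new name (cache is bypassed automatically).
print(fingerprint("style.css", b"body { color: red }"))
```

A build step then rewrites the references in the HTML to point at the fingerprinted names, so no manual F5-hammering is ever needed again.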

Meaningful Versions with Continuous Everything

Q: How should I version my software? A: Automated! All continuous delivery processes follow the same basic pattern: Engineers working on source code, configuration or other content commit their work into a git repository (or another version control system; git is used here as an example). A build system is triggered with the new git commit revision, creates binary and deployment artefacts and also applies the deployments. Although this pattern exists in many different flavors, at the core it is always the same concept. When we think about creating a version string the following requirements apply: Every change in any of the involved repositories or systems must lead to a new version to ensure traceability of changes. A new version must be sorted lexicographically after all previous versions to ensure reliable updates. Versions must be independent of the process execution times (e.g. in the case of overlapping builds) to ensure a strict ordering of the artefact
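One scheme that satisfies these requirements (a sketch, not necessarily the article's exact scheme) is a zero-padded commit counter combined with the commit hash: it changes with every commit, sorts lexicographically in commit order, and contains no build timestamp.

```python
def build_version(commit_count: int, commit_hash: str) -> str:
    """Build a version string that sorts lexicographically in commit order
    and is independent of when the build actually ran.

    The inputs would typically come from the repository itself, e.g.
    `git rev-list --count HEAD` and `git rev-parse --short HEAD`.
    """
    # Zero-pad the counter so that string ordering matches numeric ordering.
    return f"{commit_count:08d}-{commit_hash}"

print(build_version(42, "abc1234"))   # 00000042-abc1234
```

Because both inputs are derived from the git revision, two overlapping builds of the same commit produce the identical version string.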

Favor Dependencies over Includes

Most deployment tools support dependencies or includes or even both. In most cases one should use dependencies and not includes, and here is why: includes work at the implementation level while dependencies work at the level of functionality. As we strive to modularize everything, dependencies have the benefit of separating the implementations while includes actually couple them. Dependencies work by providing an interface of functionality that the users can rely upon, even if the implementation changes over time. They also create an abstraction layer (the dependency tree) that can be used to describe a large system of components via a short list of names and their dependencies. Dependencies therefore allow us to focus on smaller building blocks without worrying much about the other parts. To conclude, please use dependencies if possible. They will give you a clean and maintainable system design.
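To illustrate the "short list of names and their dependencies" idea, here is a minimal sketch (with hypothetical component names) that derives a valid build or deployment order from exactly such a list, via a simple depth-first topological sort:

```python
def resolve_order(deps: dict[str, list[str]]) -> list[str]:
    """Return an order in which each component comes after its
    dependencies (no cycle detection in this sketch)."""
    order: list[str] = []
    seen: set[str] = set()

    def visit(name: str) -> None:
        if name in seen:
            return
        seen.add(name)
        for dep in deps.get(name, []):
            visit(dep)
        order.append(name)

    for name in deps:
        visit(name)
    return order

# A system described only by names and their dependencies:
system = {"webapp": ["api", "assets"], "api": ["database"],
          "database": [], "assets": []}
print(resolve_order(system))
```

Note that nothing here refers to how any component is implemented; the whole system is described purely through the dependency interface.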

Web UI Testing Made Easy with Zalenium

So far I was always afraid to mess with UI tests and SeleniumHQ. Thanks to Zalenium, a dockerized "it just works" Selenium Grid from Zalando, I finally managed to start writing UI tests. Zalenium takes away all the pain of setting up Selenium with suitable browsers and keeps all of that nicely contained within Docker containers (docker-selenium). Zalenium also handles spawning more browser containers on demand and even integrates with cloud-based Selenium providers (Sauce Labs, BrowserStack, TestingBot). To demonstrate how easy it is to get started I set up a little demo project (written in Python with Flask) that you can use for inspiration. The target application is developed and tested on the local machine (or a build agent) while the test browsers run in Docker and are completely independent from my desktop browser: A major challenge for this setup is accessing the application that runs on the host from within the Docker containers. Dockers network isolat
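A sketch of how a test might address the two endpoints involved (the grid URL and port are Zalenium's defaults; the host IP and app port are hypothetical, and the actual driver call is commented out since it needs a running grid):

```python
def grid_url(host: str = "localhost", port: int = 4444) -> str:
    """Address of the Selenium/Zalenium grid the tests connect to
    (Zalenium listens on port 4444 by default)."""
    return f"http://{host}:{port}/wd/hub"

def app_url_for_container(host_ip: str, app_port: int) -> str:
    """URL the *containerized* browser must use to reach the app on the
    host; 'localhost' inside a container points at the container itself."""
    return f"http://{host_ip}:{app_port}/"

# Hypothetical usage against a running grid (needs `pip install selenium`):
# from selenium import webdriver
# driver = webdriver.Remote(command_executor=grid_url(),
#                           options=webdriver.ChromeOptions())
# driver.get(app_url_for_container("172.17.0.1", 5000))
```

The second helper hints at the challenge the excerpt ends on: inside the browser container, the application must be addressed via an IP or hostname that is reachable from the container, not via localhost.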

Setting Custom Page Size in Google Docs - My First Published Google Apps Script Add-On

While Google Docs is a great productivity tool, it still lacks some very simple and common functionality, for example setting a custom page size. Google Slides and Google Drawings allow setting custom sizes, but not Google Docs. Luckily there are several add-ons available for this purpose, for example Page Sizer, a little open source add-on on the Chrome Web Store. Unfortunately, in many enterprise setups of G Suite, access to the Chrome Web Store and to Google Drive add-ons is disabled for security reasons: the admins cannot whitelist single add-ons and are afraid of add-ons that leak company data. Admins can only whitelist add-ons from the G Suite Marketplace. The Google Apps Script code to change the page size is actually really simple, for example to set the page size to A1 you need only this single line of code: DocumentApp.getActiveDocument().getBody().setAttributes({ "PAGE_WIDTH": 1684, "PAGE_HEIGHT": 2384 }); To solv
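The PAGE_WIDTH and PAGE_HEIGHT attributes are given in PostScript points, and the 1684 × 2384 in the snippet is A1 paper (594 × 841 mm) converted to points. A quick sketch of that conversion:

```python
def mm_to_points(mm: float) -> int:
    """Convert millimetres to PostScript points
    (72 points per inch, 25.4 mm per inch), rounded."""
    return round(mm / 25.4 * 72)

# A1 paper is 594 x 841 mm:
print(mm_to_points(594), mm_to_points(841))   # 1684 2384
```

The same conversion yields the point values for any other custom page size you might want to set.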

Eliminating the Password of Shared Accounts

Following up on "Lifting the Curse of Static Credentials", everybody should look closely at how they handle shared accounts, robot users or technical logins. Do you really rotate passwords, tokens and keys each time somebody who had access to the account leaves your team or the company? Do you know who has access? How do you know that they didn't pass on those credentials or put them in an unsafe place? For all intents and purposes, a shared account is like anonymous access for your employees. If something bad happens, the perpetrator can point to the group and deny everything. As an employer you will find it nearly impossible to prove who actually used the password that was known to so many. Or even to prove that it was one of your own employees and not an outside attacker who "somehow" stole the credentials. Thanks to identity federation and federated login protocols like SAML2 and OpenID Connect it is now much easier to completely eliminate pass