trollmehard.co.uk

This follows a review undertaken in 2013 by former Director of Public Prosecutions Lord Ken Macdonald. It concluded that pre-registration screening is not feasible, as it would produce far too many false positives and thus undermine the UK government’s support of free speech (and the European Convention), but that a post-registration filtering and screening process would indeed be feasible and would help Nominet protect the public from being exposed to domain names like rapehimnow.co.uk.

The review indicates that 20 to 25 newly registered domains per week would need screening, and that of those, usually none would be de-registered or reported to the police.

Lord Macdonald also hints at the obvious problems anyone who wants to automatically filter “bad” words faces: false positives and context assessment. He even cites some of the more famous examples of these issues in relation to domain names, such as penisland.com.

While Macdonald repeatedly stresses the importance of free speech in general and the question of taste in regard to censorship, he caves in (in chapter seven) and postulates that post-registration filtering and screening to weed out the odd, possibly unlawful domain name is worth it.

B for effort.

Side-notes

Originally aimed at “unlawful” domain names, Macdonald at one point concludes that ONLY domains with words related to sex crimes could possibly be filtered and screened. All other crimes are just too hard to detect. Huh.

In chapters 8 and 9, Macdonald explains PRSS (an automated system for querying the .uk zone file, open to everyone who pays 400 GBP a year plus a large group of NGOs and executive offices) and defends (or tries to defend, depending on your point of view) why the Dispute Resolution Service (DRS) is not good enough to help in Nominet’s ongoing effort to keep the .uk zone file clean.

Read more at BBC, Nominet.

Photo by Pete Markham, CC-BY-SA.

First looks

For the 2013 installment of their Mobile Effects study, Tomorrow Focus Media asked users what is important to them when they decide on downloading or buying an app.

For the 519 users answering this question, usability and utility are the top factors, more important than price, reviews in app stores and friends’ opinions, underlining the importance of what a customer can see of your app in the app store.

The key is that users infer usability and utility from the scarce information visible in the app stores. Which, in turn, is why you need to make good use of these tiny pieces of screen real estate.

(37signals’ Travis Jeffrey has some great examples of app screenshots.)

From Social Media to Instant Messaging

Sometime in 2010 we started to ask ourselves which service would surpass Facebook (in usage, eyeballs, clicks, whatever the metric) and when. This question is important not only to FB, but to anyone carving a living out of the social network.

Strong contenders nowadays are instant messengers (remember ICQ?). With the advent of smarter mobile devices and cheap data plans, usage of third-party IM to replace costly text messaging is surging. Plus, since our moms and dads have their own Facebook accounts nowadays (plateau of productivity reached), teens, most importantly, have left the platform behind and gone in search of greener pastures. They found them in IM.

I will not link to all the rants, analyses and commentary on the “demise of Facebook”. Nor is this about the behavioral shift from many-to-many to one-to-one that underlies the surge in IM usage. Both points are interesting enough and I might write about them again, but they are out of scope for this article.

Facebook is an ecosystem. Advertising agencies, media agencies, social media experts and software developers, to name a few, all generate revenue from it. The point I am trying to make: They won’t be able to do this to the same extent via IM. Go figure.

(Oh: Can we, some 17 years later, drop the “Instant”?)

Photo by Volker Lannert / General-Anzeiger, depicting Friedhelm Hillebrand, one of the inventors of short message services.

SPDY, responsive performance. Fast is never fast enough.

In the good old days you got pretty far by combining a CDN and compression to deliver your frontend code. Throw in sub-genres like cookie-less domains, minification, caching and some DNS trickery, and you get the picture.
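To make two of those tricks a bit more tangible, here is a minimal, purely illustrative Node/TypeScript sketch of gzip compression plus far-future cache headers; the directory, port and header values are assumptions for the example, not the setup from any particular talk or site.

```typescript
// Sketch only: gzip compression and long-lived caching for static assets.
// No path sanitization, no content types; a real setup would use a CDN or web server.
import * as http from "http";
import * as zlib from "zlib";
import * as fs from "fs";
import * as path from "path";

const STATIC_DIR = "./public"; // hypothetical asset directory

const server = http.createServer((req, res) => {
  const urlPath = req.url === undefined || req.url === "/" ? "/index.html" : req.url;
  const filePath = path.join(STATIC_DIR, urlPath);

  fs.readFile(filePath, (err, body) => {
    if (err) {
      res.writeHead(404);
      res.end("Not found");
      return;
    }
    // Far-future caching; only safe for fingerprinted assets, HTML would get a shorter TTL.
    res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
    // Compress only when the client advertises gzip support.
    const acceptEncoding = req.headers["accept-encoding"] ?? "";
    if (acceptEncoding.includes("gzip")) {
      res.setHeader("Content-Encoding", "gzip");
      res.end(zlib.gzipSync(body));
    } else {
      res.end(body);
    }
  });
});

server.listen(8080);
```

In practice you would lean on a CDN or your web server for this rather than hand-rolling it; the point is just how cheap these wins are to switch on.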

Early in 2013 I did a ramp-up talk at Razorfish in Berlin on why this is important, what you can achieve and which tools help you do it. Back then there was no reason, in a frontend performance 101, to go into detail about SPDY or HTTP/2.0.

Things have changed. SPDY is getting adopted by some major players out there, exposing it to many smart people, some of them even writing about it.
Here is a piece by CloudFlare’s John Graham-Cumming, who sheds some light on the difference SPDY makes and why we need to rethink some of our current toolkit to get the best out of it. Thanks for sharing, John.

Mark Zeman is a digital Creative Director by day and founder of SpeedCurve by night. When he writes about the performance of responsive websites, he does do some pitching for his product, but the article is worthwhile anyway, mostly because of the pointers to other interesting material. Have fun.

Adding complexity

It’s the passion that makes us lose sight of the danger looming ahead, the trap we’re edging towards thanks to our subjective assumptions and vague speculation, the trap of building an over-designed and over-complicated system for its own sake. *

Most of what we develop is very complex, yet at the same time it is just glorified text processing and arithmetic.

Complexity in software has two roots. One is the inherent complexity of our surroundings. It is amplified by the need of software to work on the assumption that something is either true or false, where in the physical world this black-or-white decision is not always possible. The other is a bit esoteric: the acquired complexity we think we need in order to model the undecided physical world in binary. We usually are not in a position to understand the full scope of what we are working on. We assume and estimate, but we do not know. When we are done, we do have a pretty good idea of what we did, but we still do not know whether we actually are done. We also do not know for sure whether all the flexibility we built in is ever going to be needed to the extent we thought it would (acquired complexity).

There are many ways to tackle complexity, to break it down and make the goal achievable. Along the way, decisions need to be made based on our understanding of the problem at that time. What sounded like a good idea at the time might turn out to have piled up technical debt. We are then forced to pay back that debt by refactoring our code so that it reflects our up-to-date, and most of the time much better, understanding of the problem.

The question Maxim raises in his article CMS Trap (quoted at the top) is, at first glance, how to speed up the clearing of technical debt, or how to avoid accumulating too much of it in the first place. His context is a special one, aiming at the early stages in the development of a website for a start-up. But let us leave this aside for now.

Flexibility is the ability of a system to change its behavior. The quicker it can do that, the more flexible it is. While some behaviors can be changed easily, others require much more thought, thus adding complexity. Take something that spits out structured markup: displaying a different data point anywhere in a template is not hard. Truncating it by an arbitrary value chosen by an editor needs much more work.
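To make that difference concrete, here is a small hypothetical sketch; the names (Article, TeaserSettings, renderTeaser) and the maxLength knob are made up for illustration, not taken from Maxim’s article.

```typescript
// Hypothetical types standing in for "something that spits out structured markup".
type Article = { title: string; teaser: string };

// An editor-controlled setting, e.g. coming from some CMS configuration screen.
type TeaserSettings = { maxLength?: number };

function renderTeaser(article: Article, settings: TeaserSettings = {}): string {
  // The simple behavior: just output the data point.
  if (settings.maxLength === undefined) {
    return `<p>${article.teaser}</p>`;
  }
  // The flexible behavior: truncate by an arbitrary, editor-chosen value.
  let teaser = article.teaser;
  if (settings.maxLength > 0 && teaser.length > settings.maxLength) {
    teaser = teaser.slice(0, settings.maxLength).trimEnd() + "…";
  }
  return `<p>${teaser}</p>`;
}

// Usage: the second call only exists because we assume an editor wants that knob.
const article: Article = { title: "Adding complexity", teaser: "Glorified text processing and arithmetic." };
console.log(renderTeaser(article));                    // full teaser
console.log(renderTeaser(article, { maxLength: 20 })); // editor-truncated teaser
```

The second code path is exactly the kind of flexibility that rests on assumptions about who will actually use it.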

There are two assumptions here. First, that there is an editor next to the developer. Second, that this editor needs control of how to display data without involving the developer.

The fundamental question is how to establish whether an assumption is valid enough to justify adding complexity.

Image by Mark Witton, CC-BY-NC-SA.

*) While Maxim’s article might be a good read for some, I think I should mention that I do not share his point of view. Nevertheless, it made me think. So, thank you Maxim :-)

Sony to create adult live streaming section

What happened when Sony opened its PlayStation Live Stream service (using Twitch and Ustream)?

Twitch had to enforce their “gaming content only” policy because people broadcast strip shows directly from their living rooms. (See Kotaku for detailed coverage.)

Dear Sony. You give millions of people around the globe a decent camera and microphone and honestly expect them to share videos of themselves playing Knack all day?

Create an adult section, charge 5, heck 10 bucks a month to subscribe and enjoy the additional cash.

Sorry about the headline.