Sunday, April 23, 2017

The Next 100 Years: A Forecast for the 21st Century, by George Friedman

This short but dense book is about all the stuff that I am really weak on: geography, history, politics and economics. Yeah, like everything remotely related to the real world. Yet I actually liked it a lot, especially the first half. In The Next 100 Years, George Friedman submits his thesis that geopolitics is what shapes history and that countries are pretty much locked in patterns they cannot escape. Using this theory, he attempts to predict the events of the twenty-first century. And while he opens by comparing historical expectations and predictions with the reality twenty years later, finding them completely different, he concludes that some factors weigh too heavily in the long run not to predictably shape the direction of history.

What I liked about the book was how easily, cynically and depressingly he describes the underlying reasons for stuff that has happened, things that are published and marketed as triumphs of humanity or struggles of heroes and that, really, were quite unavoidable. An example is female emancipation. As having 8 children per family (the norm in the 19th century!) became unprofitable - children were less needed as unskilled labor, while sending them to school to give them skills was expensive - and as child mortality fell and life expectancy grew, families started having fewer children. That meant a woman no longer had to dedicate her life to breeding and raising children. With that much extra time, it became unavoidable that women would do something with it. Same root for changes in family structure, for violent splits between conservatives and progressives, and for the contraction of religious power, which is trapped in defending unsustainable societal models, like keeping women as household administrators.

Same thing about the way countries react, why the U.S.A. became the global power that will shape this century and how the actions of the great actors in global politics, sometimes appearing as chaotic or insane, are very clear and even predictable once one determines the real goal behind those actions. As the leading global power, the United States of America doesn't need to win any wars, for example, just keep the other major players locked in situations of local crisis. Ending terrorism or bringing peace and stability all over the globe, while declared as the goals of international policy, are against American interests.

I was saying that I enjoyed the first half of the book more, because it was broader and more theoretical. There are some interesting predictions there, such as Turkey, Poland and Mexico becoming important actors in the political conflicts of this century. The second half goes into economics and becomes really America-centric, using concepts familiar only to someone living inside that economy. Yet it describes some interesting dynamics, such as the prediction that by the 2030s immigration will not only no longer be a problem, but something leading nations compete for, as their working populations shrink.

I can't possibly comment on the veracity of the book's predictions or on the validity of the author's methods, as I am a complete noob in all of the fields required to analyse this book. I can tell you that Friedman expected much more to have happened by the late 2010s than actually did. Also, I did a quick Google search for other opinions and I will detail them below. But first, let me summarize the book as I see it.

Friedman asserts that the US will be the pivot around which all history revolves in the twenty-first century. It has the largest military force, it controls both oceans with its powerful navy, effectively dictating who can or cannot use them to move resources, and even as a heavy importer of resources, it has enough of its own. Moreover, its territory is unassailable by land. Other players will attempt to balance that power - the European Union, Russia, the Muslim nations, the South Asian nations - while the US will actively work to destabilize them so that they never get there. Europe is pretty much over, though, decadent and divided. China will fail economically, then split into regional powers easily manipulated against Chinese stability. Also, constrained by history and culture, Japan, China and Korea will never be able to work together effectively to create a regional power stable and strong enough to balance the US. The only country that can rally the Muslim countries around it is Turkey; the rest just fight among themselves, and whenever one of them manages anything, America pulls the carpet from under its feet. Yet Turkey is so far a secular country, and even with the strong hand of Erdogan it will never convince countries that are essentially religious barbarians. Last and most important is Russia, which will also fail: it relies too much on its hydrocarbon exports, something the US will subvert by investing heavily in alternative energy sources, and it is surrounded by countries that will never fully come under Moscow's control, with the US always encouraging the opposite. Friedman then asserts that by 2080 Mexico will emerge as a competitor to the US.

Many of the comments online mention the shock value of dismissing European or Chinese importance in the world in the long term. Personally, I feel that if the US becomes such a comfortable global power, the world is going to be boring at the very best and probably really sad. Friedman himself wrote another book two years later, The Next Decade: What the World Will Look Like, in which he expresses his fear that so much power will corrupt the very foundation of the nation and turn the republic into an empire. Many feel that this century is so different from any other that it cannot be accurately predicted by looking back at history. A "break it or make it" century, they call it. Let's hope it's the latter.

Other criticism targets the book's singular focus on geopolitics at the expense of economics, technology, religion or culture. I believe he did that for shock value too, trying to pull people into the discussion by underestimating or outright ignoring things with so much emotional value for a lot of people. He basically said "the world works like this, not like you would like it to work, deal with it!", then waited for the comments. A bit trollish, Friedman is.

Finally, the more militarily oriented critics fault Friedman for relying too much on conventional military paradigms and ignoring space warfare, WMDs and the informational angle. I can only see these as stabilizing forces, rather than destructive ones. If the possibility exists that a sufficiently pissed off player might destroy the whole board, then the actions of all the players will be more subdued than they would otherwise be. This type of possibility made warfare more subtle, not more unpredictable.

Bottom line: an eye opening book, more valuable, probably, for its concise analysis of global history than for its predictions, and for explaining why so many countries behave like idiots. In the end, the very purpose of the book, predicting this century, is made moot by its own thesis that the century can be predicted: if that is so, then whatever happens, happens, no matter what anyone does. I also believe it is great material for fiction writers who want to ground their universes in reality. While the predictions themselves, wrong or spot on, are irrelevant, the method behind them is most interesting and worth investigating. I mean, George Friedman is no Hari Seldon, but he is the closest we've got.

George Friedman has a lot of video talks and interviews detailing his views. One can easily look them up online.

Friday, April 21, 2017

Athens in the spring

I've just returned from vacation in Greece where I spent about three days in Athens, the country's capital. It was an interesting experience, mostly because it felt so depressingly familiar, but also because it showed both promise and disappointment at the same time.

General impressions

I would have liked Athens to look like this (click to enlarge):

An orange tree in bloom, smelling wonderful, with a lazy cat comfortably laying at its base, not a care in the world.

Instead, it was mostly like this:

A nice little building with traditional Greek balconies on a cozy street (with the mandatory orange trees), next to a derelict ruin covered in graffiti and garbage.

Indeed, for me Athens felt downright schizophrenic. The day we arrived we walked toward the very center of the city, the place where tourists go for expensive dinners in an area filled with restaurants. We had to walk on streets covered in garbage, populated by vagrants, or streets where the only businesses were Chinese import companies and the pavement was filled with dirty dark skinned people doing suspicious commerce. And no, I am not afraid of dark skinned people, but I was accompanying two women and I was damn nervous. Yet all this was not some shady part of the city; these areas were intermingled with lighted streets where restaurants and tourists were abundant. And in the less touristy part of the center of Athens it was the same story, only told by buildings. Prosperous companies housed next to unfinished, abandoned or really ugly constructions. It wasn't uncommon to see a whole first floor with the windows empty or boarded up with wood, above shops on the ground floor. Vagrants everywhere, and they didn't look like Syrian migrants, either. Yet they didn't seem that violent, and people walked around them as if it were the most natural sight in the world.

It reminded me so much of Bucharest, only here the restaurants and shops are less expensive and the vagrants less common. In Romania, that kind of lack of social status and resources often breeds frustration, anger and violence, and police actively try to get rid of homeless people. In Athens, this mix of opulence and filth looked like a given. The traffic in Athens also reminded me of my home city, only again, it felt more extreme and more subdued at the same time. It was common to see people cross the street in the middle of the boulevard, cars and motorbikes rushing towards them, without a hint of fear. Cars would stop and let them pass, the drivers obviously not happy, but not expecting anything different either. The behavior was validated by the crazy traffic lights, which turned green then back to red in a matter of seconds on some large streets and boulevards, yet stayed green a long time, then blinked green before changing color, on small roads with barely any traffic. The "hop on, hop off" bus - a double decker vehicle filled with people, nicely avoiding the streets where people were sleeping on the ground and keeping to the nice sides of the city - would routinely stop mid-route to wait for taxi cabs whose drivers were chatting with customers, or for people randomly parking cars or scooters or whatever in the middle of the road. Again, in my home city this happens all the time, but people get angry, with frustrated honking and loud swearing. This driver seemed quite calm squeezing through spaces that barely allowed the bus to pass, or waiting for these people.


Anyone who knows me also knows that I rate the quality of a place by the taste of the local food. I could be surrounded by black beggars and still enjoy a good meal (as long as I don't have to smell people). However, the types of places you can eat at in Athens are also quite different. One can follow the TripAdvisor recommendations and find themselves paying 6 euros for a beer in an Irish café and eating crap for a lot of money; that is, if they even find the place there anymore, since the economics of the area seem quite dynamic. One can go where most people seem to go, and end up in a typical tourist trap tavern serving Greek-style euro-food that doesn't mean anything, tastes like anything else and, again, is expensive as shit. Yet there is also the possibility of ending up at a nice Greek tavern, or some other type of place where you can eat and drink and enjoy both, as well as the mood and atmosphere of the place. And as with other aspects of the town, you can find all these types of locales one next to the other.

For example, we went to the fish and meat market. It's a huge place where people try to sell you fresh fish, mollusks, lamb, pork and so on. After walking around and frankly getting fed up with the smell of fish, I was about to leave when I noticed, in an out-of-the-way nook of the market, a tavern. I immediately went there. I mean, if the people who work there also eat there, using the fresh ingredients they sell, it's gotta be good. And it was! We ate some really inexpensive stuff, with Greeks sitting (and smoking) at the other tables, all singing together with a guy playing the accordion. And let me tell you this: the songs they sang and knew by heart were not the type of songs that outsiders think of as "traditionally Greek", although they were obviously so. Also, the accordion guy was not expecting money or anything; he was playing because he liked it. I loved the place, although it was clearly "a bomb".

Similarly we found a Greek tavern right next to some fancy "cafés" that served expensive drinks and coffee and snacks that were barely food. We had moussaka and Greek salad with retsina and tsipouro and it was wonderful. We were slightly interrupted by some child beggars; they were Romanian.

Amazingly enough, I had no souvlaki, not for lack of trying, but because I was there with evil women who seemed bent on wearing my legs off with their damn walking and sightseeing! I was also really attracted by some Indian and Bangladeshi places that seemed even more "explosive" than the market tavern. Yet they were in the area with all the beggars and import companies and I couldn't convince anyone to go there. I might have chanced it, maybe, if I hadn't had to fly to Bucharest the next day, when going to the bathroom every half an hour would have been kind of difficult.

The Akropolis

It's a bunch of ruins on top of a mountain, infested by tourists and, quite frankly, mostly fake. I mean, the Akropolis museum is much more interesting, and it also shows how many times the Akropolis was damaged, destroyed, raised and restored. To me, the picture that felt most true is this:

A mass of indistinct people sucking away any trace of tradition, history or sacred from a bunch of replica stones and statues that need heavy machinery to even stay in place.

For a moment I imagined they were installing the machines in order to turn the place into a transformer of sorts. One could see the Akropolis in various stages of its existence: press a button and suddenly the Parthenon is a mosque from the times of the Turkish occupation and the Erechtheion houses the harem; press another and you get a church of the Virgin Mary.

If you want more, just google it.

The people

I've seen really tall, muscular Greeks and also small little dudes. It seemed like there was a gap in the middle, where an "average" Greek was harder to find. Girls were, as a rule, rather ugly, with a tendency to be short and fat. I've seen cute Greek girls, but they were all young and few and far between.

As a rule they were all rather polite, although we didn't interact with a lot of them. At one point we went to a tavern where the Greek waiter spoke some Romanian words; many of the employees were from Suceava and he had picked up some of the language - he seemed to enjoy his association with Romanian people. Also, for a place filled with homeless people, Athenians didn't seem to fear theft much. I saw people leave their bag and cell phone on the ground while they went back to their motorcycle to get something, and many shops that kept products outside, ripe for the taking.


Athens didn't feel at all like a tourist city. Like Bucharest, it has its quirks and nice places, but its pragmatic purpose is to be a capital, not a place to explore and enjoy as a tourist. After two days there you have to ask the locals what else to look for, and I bet most of them would have to think hard before coming up with an answer. The city is a lot larger than its center and we didn't see it all, so I am sure there are aspects we missed, but the little I've seen shows a place whose growth was stunted by the country's economic collapse. It is not a place that is poor or rich, but rather something that feels diseased, healthy tissue surrounded by corrupted tissue. Yet is it healing, or sinking deeper into sickness? That I cannot tell.

What I can tell you is that I don't regret seeing it, but I wouldn't go back there. My favorite experiences were smelling the blooming orange trees and eating at the fish market. The rest felt totally forgettable.

Tuesday, April 18, 2017

The Stars My Destination, by Alfred Bester

In the first pages of the book we learn about a man who discovered teleportation when he was about to die. Like telepathy, it was a mental thing, and most people seemed to be capable of it over short distances. I immediately thought it was quaint and that I was going to read one of those hopelessly out of date books. Indeed, published in 1956, the book didn't age well in terms of social or technological depiction, but it was really entertaining.

First of all, with all its flaws, this book isn't boring. Alfred Bester managed to create a true antihero as the protagonist and kept him that way until close to the end of the book. In The Stars My Destination (also known as Tiger! Tiger!) the hero is a brute with no ambition, no desire for improvement, content merely to exist. When left for dead on a derelict space rocket, his survival instinct turns into obsessive hate towards the ship that didn't save him, and his whole life turns to revenge against it. Imagine The Count of Monte Cristo in space, where everybody can teleport at will. And he is hardcore. The first thing he does when he gets back is train his teleportation powers and, when discovered by a negro woman (back then it was an OK word), he blackmails her with knowledge of her family and promptly rapes her. Yup, that's the hero.

The story gets a little inconsistent afterwards, as this unambitious low life suddenly proves capable of immense personal change in behavior, knowledge and social status. However, following his mad, maniacal hunt for the thing that offended him is dizzying. The world depicted by Bester is a quasifeudal multiplanetary society led by dynasties named after the successful brands of the time, like Kodak (yeah, I know), and devoid of any social justice or human empathy. In fact, when describing the emergence of teleportation, the immediate effects he predicts are diseases spreading through the world, carried by vagrants and immigrants who simply teleport out of poor and backward countries into the civilized world (I told you the book has not aged well). Yet for all of this, it is a mesmerizing world.

As the book is short - I finished it in a few hours - there is no reason not to delve into something that is both incredibly quaint and amazingly refreshing. While the ending felt a bit too inconsistent with the beginning, the story was interesting and I loved the flawed and brutal character, something I doubt you will see in any mainstream story today. Like its protagonist, it is one of those works that I couldn't not like, despite its many flaws.

Monday, April 17, 2017

The Long Way to a Small Angry Planet, by Becky Chambers

I think I am way too trusting of book reviews, as I was with the one by Andrew Liptak about Becky Chambers' debut, The Long Way to a Small, Angry Planet. He said it was the most fun space opera book he'd read that year, and since the review was published in September, I just automatically added it to my list without reading further, for fear of spoilers. It was a bad idea.

The book is the space opera version of a romance novel. There is a ship run by a quirky multispecies crew on their way toward a planet that is to become the interface point between the galactic federation analog and a new species of warring, incomprehensible aliens. So far so good. The problem is that the book is all about the one year trip there, with the focus solely on the personal histories, dramas and emotions of the crew members. They fall in love with each other, they transgress stupid speciesist prejudice, love conquers all, that sort of thing. Then the book ends, with everybody nicely paired and friendly towards everyone else. That's it. I mean, that is literally it. The book ends when one is already bored to death and waiting for something interesting to happen. And it never does!

While I have to admit that 50 Shades of Grey was way worse and sold a lot better, that is in no way a recommendation to read this book. It is technically obsolete, morally ridiculous and abhorrently politically correct. It becomes pathetic to see how the author attempts to justify interspecies romance and sex and to condemn biases about other sentients, even inventing new pronouns, only to fall into other, more boring clichés: what our future society should be like, how people should behave with one another, how nasty people look (unkempt, fair skinned, dark haired antisocial techs or chitinous insect-like creatures), how inner beauty translates to outer beauty, the easy making of friends and so on. It felt like a high school story with aliens. I mean, there was a moment when their ship is boarded by pirates and the resolution is to politely talk to them and reach an agreement, because this particular species was culturally bound to take only what they need and no more. Really?

Bottom line: it is a puerile make believe fairy tale about space lovers with almost no science in it. It is another fantasy novel that tries to trick readers by posing as science-fiction, only it is a romance fantasy novel. And if you are eager to read passages of interracial space sex, forget it: the most controversial word you will see printed in this book is "banging", and I believe it is used but once.

Saturday, April 15, 2017

Finity's End (The Company Wars #7), by C.J.Cherryh

I don't remember where I got the idea to read Finity's End, but I had no idea it was one book in a long series of Merchanter Alliance/Union books. So, while I liked that it started with a rich world and a very clear idea of where it was coming from and going to, I also felt the lack of information I would have had if I had read the series from the beginning.

That being said, I liked the book. I found C.J.Cherryh's world building extraordinary, and considering she has written more than 60 books, her attention to detail and the way things click together is really nice. Yet at the same time, this made the story slower, more grounded in a larger reality that didn't really interest me.

The story of Finity's End is that of a young boy who wants to live on a planetary station and study the intelligent indigenous life with which he made the only personal connection he values. Instead, he is pulled up to a spaceship he doesn't care about, for the sole reason that his mother was originally born there. Faced with prejudice and culture shock, his story of adapting to the new environment is a lot more interesting than the political and economic dealings the ship is engaged in, and I often felt that, while I appreciated the realism of the plot, I couldn't wait to get back to the personal part, with the characters I really cared about. Perhaps if I had read the entire series I would have felt more connected to the older characters and less to this adolescent coming of age subplot.

I found interesting the author's description of two entwined cultures that live in different space and even different time: there are the planet and station folk, and then there are the spaceship people. They interact, but for most of their existence they are almost different species. Perhaps if the viewpoint on this had been more modern I would have enjoyed it more, but as it is, it felt like a book written in the 70s.

Bottom line: I liked the book, but not so much that I will start reading the others in the series. I liked the world building and I loved the way the characters subtly evolved in their interaction with each other. I didn't really connect with the larger plot of trying to bring peace to the galaxy and I felt nothing towards the Mazian boogieman who never made an appearance in this book. While I believed the possibility of a future like that, the technological part of the book felt quite obsolete to me, even if the publishing date for Finity's End is 1997.

Using multiple projects in Visual Studio Code

A reader asked me how to work with multiple projects in Visual Studio Code and, after fumbling a little, I realized I had no idea. So I started trying things out.

First I created a folder, in which I created two other folders ingeniously named Proj1 and Proj2. I went into both and ran dotnet new console and dotnet new classlib respectively. I moved the Console.WriteLine("Hello World!"); code from the console project's Program.cs into a static method in the Class1 class, then called that method from Program.cs's Main, then tried to find ways of referencing Proj2 from Proj1 in Visual Studio Code.

And here I got stuck. I tried the smart fixes VS Code recommended, but none of them included adding a reference. I right clicked on everything, to no avail. I wrote using Proj2; by hand, hoping that Code would magically understand I needed a project reference. I googled, only to find old articles that discussed project.json, not the .csproj type of .NET projects.

In the end I resigned myself to writing the reference by hand. I opened Proj1.csproj and added, inside an <ItemGroup> element (ProjectReference items must live in one):
  <ItemGroup>
    <ProjectReference Include="..\Proj2\Proj2.csproj" />
  </ItemGroup>

After saving the file and going to the unresolved Class1 reference, I now got using Proj2; as an option to fix it. And now I got to the problem my reader was having. When trying to run Proj1, I got Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Proj2, Version=, Culture=neutral, PublicKeyToken=null'. The system cannot find the file specified. at Proj1.Program.Main(String[] args).

It's disgustingly easy to solve, you just need to know what to do. Either Ctrl-Shift-P and type restore, then select restoring Proj1, or do it manually by going to the Proj1 folder and running dotnet restore by hand. After that the project compiles and runs.

To recap:
  1. add the project reference by hand to the .csproj file
  2. resolve whatever compilation errors you have by specifying the correct usings or inlining namespaces
  3. dotnet restore the project you added references to
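
As an aside, if you have a recent .NET Core SDK, the same workflow can also be done entirely from the command line with dotnet add reference, which writes the ProjectReference into the .csproj for you. This is only a sketch (using the same Proj1/Proj2 layout as above), so exact behavior may vary with your SDK version:

```shell
# create the two projects, as in the example above
dotnet new console -o Proj1
dotnet new classlib -o Proj2

# add the reference; this edits Proj1.csproj for you,
# so the manual <ProjectReference> step is not needed
dotnet add Proj1 reference Proj2/Proj2.csproj

# restore and run
dotnet restore Proj1
dotnet run --project Proj1
```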

Thursday, April 13, 2017

Lambdas, LInQ, Javascript and so on

As a .NET developer I am very familiar with LInQ, or Language Integrated Query, which is a collection of fluent interface methods for querying data from collections. However, many people outside the .NET ecosystem are not familiar with the concept, or they use it as disparate functions in their language of choice. What makes it even more confusing is that the same concept is implemented in other languages under different names. Let me give you an example:
var arr = new[] {1,2,3,4,5,6};
var result = arr
  .Where(v => v%2==0)  //get only even values
  .Select(v => v*10)   //return their values multiplied by 10
  .Aggregate(15, (s,v) => s+v); //aggregate the values into a sum that starts with a seed of 15
// result should be 15+2*10+4*10+6*10=135

We see here the use of three of these methods:
  • Where - filters the values on a condition
  • Select - changes the values it returns
  • Aggregate - creates an aggregate value using an operation on all the values in the collection

Let me write you the same C# code without using these methods:
var arr = new[] {1,2,3,4,5,6};
var result = 15;
foreach (var v in arr) {
  if (v%2==0) {
    result += v*10;
  }
}
// result is again 135

In this case, some people might prefer the second version, but it is only an example. LInQ is not a silver bullet that replaces all loops, only one tool among many in a large toolset. Some advantages of using such methods are concise code, better readability and a common API for iterating, filtering and querying collections. For example, in the widely used Entity Framework, or its previous incarnations such as LINQ to SQL, the queries look the same, but they are translated into SQL, sent to the database and executed there, just once. So instead of fetching thousands of records and filtering them in memory, it translates the expression of the function passed to the query into SQL and executes it on the server. The same sort of operations can be used on streams of data, rather than fixed collections, as in the case of Reactive Extensions.

Some other methods in this set include:
  • First/Last - getting the first or last element in an enumerable that satisfies a boolean condition
  • Skip - ignoring a number of values in a collection
  • Take - returning a number of values in a collection
  • Any/All - returning true if at least one or all of the items satisfy a boolean condition
  • Average/Sum/Min/Max - specific aggregating methods for the elements in the collection
  • OrderBy/OrderByDescending - sorting
  • Count - counting

There are many others; you can look them up in the System.Linq.Enumerable documentation.
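
For Javascript people, most of these have direct counterparts as standard Array methods. A quick sketch (the names on the left of each comment are the LInQ equivalents):

```javascript
var arr = [1, 2, 3, 4, 5, 6];

var first = arr.find(v => v > 3);               // First: first element matching a condition -> 4
var skipped = arr.slice(2);                     // Skip(2): ignore the first two values
var taken = arr.slice(0, 2);                    // Take(2): keep only the first two values
var any = arr.some(v => v % 2 === 0);           // Any: is at least one value even? -> true
var all = arr.every(v => v % 2 === 0);          // All: are all values even? -> false
var sum = arr.reduce((s, v) => s + v, 0);       // Sum -> 21
var max = Math.max.apply(null, arr);            // Max -> 6
var sorted = arr.slice().sort((a, b) => b - a); // OrderByDescending (slice() avoids mutating arr)
var count = arr.length;                         // Count -> 6
```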

Does this system of querying data seem familiar to you? To SQL developers it will feel second nature. In SQL the same result from above would be achieved by using something like:
SELECT 15 + SUM(v*10) FROM table WHERE v%2 = 0

Note that other than putting the source of the data in front, LInQ syntax is almost identical.

However, in other languages this sort of data querying is often called map/reduce, and in fact there is a widely used programming model called MapReduce that applies to big data processing. In Java streams, the function that filters data is called filter, the one that transforms values is called map and the one that aggregates data is called reduce. It's similar in Javascript. Here is the same code in Javascript:
var arr=[1,2,3,4,5,6];
var result=arr
  .filter(v=>v%2==0) //get only even values
  .map(v=>v*10)  //return their values multiplied with 10
  .reduce((s,v)=>v+s,15); //aggregate their value into a sum that starts with a seed of 15
// result should be 15+2*10+4*10+6*10=135
Note that the lambda (arrow function) syntax used here is new in ECMAScript 6. Before that, you would have to use the function(x) { return [something with x]; } syntax.

In Haxe, the concept is achieved by using the Lambda library and the functions are again named differently: filter for filtering, map for altering and fold for aggregating.

There is another sort of people who would instantly recognize this model of data querying: functional programmers. Indeed, SQL is at its core a declarative language with functional roots, and the same standard for data querying is used very efficiently in functional programming languages, since they know whether a function is pure (has no side effects). When dealing only with pure functions, some optimizations can be made on the query by the compiler before anything is even executed. Haskell, for example, has naming similar to Haxe's (filter, map and the foldr/foldl family).

So whenever I get to review other people's code, especially people that have little experience with either SQL or C#, I cringe to see stuff like this:
var max=-1;
for (var i=0; i<arr.length; i++) {
  if (max<arr[i]) max=arr[i];
}
In my head this should simply be arr.max();. And considering how easy it is to implement something like this in Javascript, for example, it's a crime not to use it:
Array.prototype.max=function() { return Math.max.apply(null,this); }
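
One caveat to the apply trick (my note, not something from the original idea): Function.prototype.apply spreads the array into call arguments, so for very large arrays it can exceed the engine's argument count limit. A reduce-based version avoids that, and nicely shows that max is itself just another aggregate:

```javascript
// max written as a reduce/fold, safe for arrays of any length
Array.prototype.max = function() {
  return this.reduce(function(m, v) { return v > m ? v : m; }, -Infinity);
};

var m = [3, 1, 4, 1, 5].max();
// m is 5; an empty array yields -Infinity
```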

Yet there is more to this than my personal preference for reading code. Composition, for example. Because this works like a fluent API or a builder pattern, one can keep adding conditions to a query. Imagine you have to filter a list of strings based on a Google like query string. At the very minimum you would need to split the query into strings and filter repeatedly on each one. Something like this:
var arr=['this is my special query string','this is a string','my query string is this awesome','no query strings here, move along','these are not the strings you are looking for'];
var query="this is a query string";
var splits=query.split(/\s+/g);
var result=arr;
splits.forEach(function(s) {
  result=result.filter(function(v) { return v.indexOf(s)>=0; });
});
// result now holds only the strings that contain every word of the query
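The composition idea can also be sketched by building one predicate per search word, folding them into a single combined predicate, and filtering only once. This is my illustration, assuming `arr` and `splits` as defined above:

```javascript
var arr = ['this is my special query string', 'this is a string',
           'my query string is this awesome', 'no query strings here, move along',
           'these are not the strings you are looking for'];
var splits = 'this is a query string'.split(/\s+/g);

// one predicate per word, folded into a single combined predicate
var matchesAll = splits
  .map(function (word) {
    return function (s) { return s.indexOf(word) >= 0; };
  })
  .reduce(function (f, g) {
    return function (s) { return f(s) && g(s); };
  });

// a single pass over the data, driven by the composed predicate
var result = arr.filter(matchesAll);
```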

There is a lot of stuff I could be saying about this subject, but I need to summarize it. It's all about inverting loops. Instead of having to go through a collection, a stream or some other data source, then executing some code for each element, this method allows you to encapsulate the operations you want to execute on those elements, pass them around, compose them, translate them, then use them on any data source in the same way. A common API means reusability, better readability of code, less written code and a simpler declaration of intent. Because we get out of the loop system, we can expand the use for other paradigms, such as infinite data streams or event buses.
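To illustrate the last point (my sketch, not from the original post): the same filter/map vocabulary can be pointed at an infinite data stream using ES6 generators, because nothing in the operations themselves assumes a finite array.

```javascript
// an endless stream of integers, produced lazily
function* integers() { for (var i = 1; ; i++) yield i; }

// the same vocabulary as with arrays, but over any iterable
function* filter(it, pred) { for (var v of it) if (pred(v)) yield v; }
function* map(it, fn) { for (var v of it) yield fn(v); }
function take(it, n) {
  var out = [];
  for (var v of it) { out.push(v); if (out.length === n) break; }
  return out;
}

// first three even numbers, multiplied by 10
var firstThree = take(map(filter(integers(), v => v % 2 === 0), v => v * 10), 3);
// firstThree is [20, 40, 60]
```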

Fear and Hope

It occurred to me recently that the opposite of fear is hope. Well, of course, you will say, didn't you know that? I did, but I also hadn't fully grasped the concept. It doesn't help that fear is considered an emotion, while hope is treated as a more complicated idea.

I was thinking about the things that go wrong in my country and some of it, a large part, comes from bad laws. And I was trying to understand what a "bad law" is. I tried some examples, like the dog leash one - I know, I have a special personal hatred for that one in particular - but then I noticed a pattern. It's not so much about the content of the law as about its trigger. You see, lawmakers don't propose and pass laws because they like the work, but because there was an event that triggered the need for that law. Law is always reactive, not proactive. It could be proactive, but there is a lot more effort involved, like convincing people that there is an actual problem that needs addressing. It's much easier to wait for the problem to manifest and then try (or pretend) to fix it.

Anyway, the pattern that I noticed was related to the trigger for individual laws. The bad laws were the ones that came out of fear. One kid got killed by stray dogs, kill them all and institute mandatory leashes on pets. The good laws, on the other hand, come from hope. Lower taxes so people are more inclined to work and thus produce more and so get more tax in. Hopefully people will not be lazy.

And it's not only related to laws, but to personal decisions as well. Will I try a new thing, hoping that it will make me better, teach me something, be fun, or will I not try it because it is dangerous, somebody might get hurt, I may lose precious time, etc? When it is so abstract it's almost a given that you will take the first choice, yet when it is more personal fear tends to paralyze.

Fear is also contagious. The people who want us to be afraid are afraid themselves. Control freaks, power hungry people, they don't want to take us to a better place because they are afraid to lose that control, because they are afraid of what might happen. And their toolkit is based on fear, too. Something exploded and killed people, some asshole drove a car into people: we must ban explosives, cars and - just to be safe - people. Don't go to space because people might die, although they die every second and most of the time you don't care about it. Let's hoard money and things because we might not get another chance to have them, because we might lose them, because we are so afraid. The fear people don't know any other language but fear and they will use it against you. Much easier to instill fear than to give hope, so hope is not that contagious. It is fragile and it is precious.

I submit that while fear might keep us safe it will never make us happy. The very expression "to keep safe" implies stagnation, keeping, holding, controlling, restricting freedom.

So here is my solution. As Saint-Exupéry said, perfection is achieved not when there is nothing more to add, but when there is nothing left to take away. Let's strictly define our safe zone, the area in which we need to be safe in order not to be afraid. Personally, as a group, as a country, as a planet, let's set the minimum requirements for being safe, a place or situation we can always retreat to and not be afraid in. Whether it is a place of your own, a lack of debt, a job or business that gives you just enough money to survive and not spiral out of control, a relationship or some other safety net, everyone needs it. But beyond it, let's abandon fear and instead use hope. Hope that you can do more, that you can be better, that you can live more or have fun, that other people will act well rather than badly, that strangers will help rather than harm you, that the unknown will reveal beauty rather than terror.

I will choose to define good decisions as coming from hope. Will that hope be proven to be unfounded? Maybe. But a decision based on fear will never ever be good enough. And if all else fails, I have my safe zone to get back to. And I know, I very much know that having a place to get back to from failure is a luxury, that not many people have it as good as I do, but to have it and still live in fear, that's just stupid.

Saturday, April 08, 2017

Song of Kali, by Dan Simmons

book cover A friend of mine recommended this as one of his favorite books, so of course I went into it with very high expectations and of course I was disappointed. That doesn't mean it's a bad book, just that I expected more than I've got.

In Song of Kali, Dan Simmons describes Calcutta as a place of evil, steeped in a culture of filth and senseless violence and death. The protagonist travels there with his Indian wife and their infant child when he is called to retrieve a new manuscript from a supposedly dead Indian poet. A lot of culture shock, a lot of weird mystical events and some horrible people doing horrible things is what the book is about.

In 1985 this was perhaps a fantastic story, I don't know, but now it feels a little cliché: an American man goes somewhere he perceives as completely alien, where he feels out of place, usually bringing his family along so that the empathy and horror can be heightened, and where abnormal things he has no control over happen. It is also part of a category of stories that I personally dislike: the "something that can't be explained or controlled" category, which implies absolutely no character growth other than realizing there are situations like that in which one can find oneself. And indeed the book is all like that: stories that make little sense, but somehow are linked to the perceptions and experiences of the protagonist; mysterious characters that do things that mean little unless the story takes them exactly to a certain point, at which you are left wondering how they knew to do that thing; and a lot of extraneous details that are there only to reinforce the feeling of disgust and dread the character feels, but do little to further the story.

In the end, it is just some weird ass plot that makes no sense, a bunch of characters that you can't empathize with (some of them you can't even understand) and a big fat "It is so because I feel it is so", which is so American and has little to do with me. Others agree that the book is most effective when describing the humid fetid heat of the city and the inhumanity of its inhabitants and less with the so called "horror" in the text or the connection the reader feels with the characters. It brings to mind Lovecraft and his strong feelings about things that now are banal and CGI in every movie. Some are even more vehement in their dislike of the book. Here is another review in the same vein.

So how come so many people speak highly of the novel? Well, my guess is that it affects the reader if they are in the right frame of mind. My friend told me about the part that he liked in the book and, frankly, that part is NOT in the book, so whatever literary hallucination he had when reading the book I had none of it. My rating of it cannot be but average, even considering it's a debut novel that won the 1986 World Fantasy Award.

Saturday, April 01, 2017

Breaking Bad, the Movie!

So I heard that there is this fan made cut of the series, two hours long, that encompasses the entire story of Breaking Bad. I got a hold of it and watched it. Pretty good. Just some lazy editing in some places, but overall good quality. Therefore, if you want to see what happened in the series overall, without bingeing on 62 hours of TV show, you might want to check it out.

My problem with the film is that it validated my decision to stop watching the series. It focused primarily on Walter's decision points, which were mostly related to his problems with his family (mostly his bitch of a wife, who I believe is one of the most irritating characters of all time), friends and coworkers. The only part that I really enjoyed about the series was the first season, where there was actual chemistry involved. Just like other shows that start off with a brilliant specialist who is rather annoying otherwise but gets away with it because he is a flawless craftsman, it begins great, then devolves into stories about his personal life. Why anyone would want to follow, for years, the personal issues of someone they only became interested in because of their work is beyond me, but this is what happens. Dr. House, Numb3rs, Elementary, Weeds, even lawyer, doctor and cop shows slowly force their heroes to stop doing their work and instead deal with all kinds of problems in their off duty life; they all lose me at season two, usually. Because of this focus on personal life, the chemistry part got removed from the movie, which makes it all the more boring.

Anyway, my duty to inform the Internet about this film is complete. It is amazing how people spend their time doing something like this for nothing, not even recognition (since claiming it would get lawyers busting their chops about using copyrighted content), yet do such a lovely job. I would love to have this sort of edit for every show on the planet. Then I would be able to keep up with all of them! :D Also interesting is that there is an IMDb page for the movie found by Google, but when you navigate to it you get a big 404 page, meaning someone probably created it and then it promptly got deleted. Even if illegal, it is still a movie, assholes! Here is the Google cached version, for as long as it works.

And BTW, if you still want to write a review of the movie, deleted as it is, you can do so by following this link. Maybe that will force the guys at IMDb to undelete the page.

Tuesday, March 28, 2017

Fevre Dream, by George R. R. Martin

book cover In Fevre Dream, George R. R. Martin writes about a fat bearded guy with a large appetite and a passion for food who loves being a boat captain. Write what you know, they say. Anyway, this book about vampires on the Mississippi feels really dated. It has been described as "Bram Stoker meets Mark Twain", so you can imagine just how dated: written in 1982, it feels as if written by a Lovecraft contemporary.

I love Lovecraft, but it gets worse. None of the characters in the book, except maybe the main protagonist, are likable. They come off as either high and mighty or ridiculously servile. And I understand that in a story where vampires have an all-controlling master this is to be expected, but at the same time the hero of the story, without being "compelled", still acts like a servant, enthralled (pardon the pun) by the aristocratic majesty of his vampire friend. One has to get through pages of tedious description of architecture and food and home improvement to get to the succulent part (OK, couldn't help that one), which then feels cloying and unsatisfying. So many interesting characters get just a few scenes, while most of the book is about how much the captain loves his food and his ship. And while it discusses some social issues, like slavery and how easily people died or disappeared at the time, it also promotes this idea of personal nobility that justifies other people getting used. This focus on aristocracy is something one sees in A Song of Ice and Fire as well, but less pronounced.

I could have given it an average to good rating if not for the abysmal ending. While at the beginning I had applauded the way the author was building tension and apparently providing a solution only to snatch it away at the last moment, the ending destroys all of it by pretty much invalidating much of the foil of the characters and a major part of the story. The time displacement also accentuates this feeling, as I thought "waited so much for this?!", and by that I mean both me as a reader and the main character in the book.

Bottom line: uninteresting vampires in a slow paced story that probably appeals to Martin fans only. It manages to insert the reader in the eighteen hundreds and the river boat mentality, but there is nothing much else to learn or enjoy in the book beyond that.

Sunday, March 19, 2017

Learning ASP.Net MVC - Part 6 - Dependency Injection and Services

Learning ASP.Net MVC series:
  1. Setup
  2. MVC Concepts
  3. Authentication
  4. Entity Framework Fundamentals
  5. Upgrading project to .NET Core 1.1
  6. Dependency Injection and Services

Previously on Learning ASP.Net MVC...

I started with the idea of a project that would use user-configurable queries to do Google searches, store the information in the results and then perform various kinds of data analysis on them and display them based on what the user wants. However, I first implemented Google authentication and then went on to write some theoretical posts. Lastly, I upgraded the project from .NET Core 1.0 to version 1.1.

Well, it took me a while to get here because I was at a crossroads. I like the idea of dependency injectable services to do data access. At the same time there is the entire Entity Framework tutorial path that kind of wants to strongly integrate EF with my projects. I mean, if I have a service that gives me the list of all items in the database and then I want to get only a few items, it would be bad design to filter the entire list. As such, I would have to write a different method that allows me to get the items based on some kind of filters. On the other hand, Entity Framework code looks just like that "give me all you have, filtered by this", which is then translated into an efficient query to the database. One possibility would be to have my service return IQueryable<T>, so I could also use the system to generate the database code on the fly.

The Design

I've decided on the service architecture, as opposed to an EF-style IQueryable approach, because I want to be able to replace that service with anything, including something that doesn't work with a database or doesn't know how to dynamically create queries. Also, the idea that the service methods will describe exactly what I want appeals to me more than avoiding a bit of duplicated code.

Another thing to define now is the method through which I will implement the dependency injection. Being the control freak that I am, I would go with installing my own library, something like SimpleInjector, and configure it myself and use it explicitly. However, ASP.Net Core has dependency injection included out of the box, so I will use that.

As defined, the project needs queries to pass on to Google and a storage service for the results. It needs data services to manage these entities, as well as a service to abstract Google itself. The data gathering operation itself cannot be a simple REST call, since it might take a while, it must be a background task. The data analysis as well. So we need a sort of job manager.

As per a good structured design, the data objects will be stored in a separate project, as well as the interfaces for the services we will be using.

Some code, please!

Well, start with the code of the project so far: GitHub and let's get coding.

Before finding a solution to actually run the background code in the context of ASP.Net, let's write it inside a class. I am going to add a folder called Jobs and add a class in it called QueryProcessor with a method ProcessQueries. The code will be self explanatory, I hope.
public void ProcessQueries()
{
    var now = _timeService.Now;
    var queries = _queryDataService.GetUnprocessed(now);
    var contentItems = queries.AsParallel().WithDegreeOfParallelism(3)
        .SelectMany(q => _contentService.Query(q.Text));
    _contentDataService.Update(contentItems);
}

So we get the time - from a service, of course - and request the unprocessed queries for that time, then we extract the content items for each query and update them in the database. The idea here is that the first time a query is defined, or whenever the interval since its last processing has elapsed, the query will be sent to the content service, which returns content items that are then stored in the database.

Now, I've kept the code as concise as possible: there is no indication yet of any implementation detail and I've written as little code as I needed to express my intention. Yet, what are all these services? What is a time service? What is a content service? Where are they defined? In order to enable dependency injection, we will populate all of these fields from the constructor of the query processor. Here is how the class looks in its entirety:
using ContentAggregator.Interfaces;
using System.Linq;

namespace ContentAggregator.Jobs
{
    public class QueryProcessor
    {
        private readonly IContentDataService _contentDataService;
        private readonly IContentService _contentService;
        private readonly IQueryDataService _queryDataService;
        private readonly ITimeService _timeService;

        public QueryProcessor(ITimeService timeService, IQueryDataService queryDataService, IContentDataService contentDataService, IContentService contentService)
        {
            _timeService = timeService;
            _queryDataService = queryDataService;
            _contentDataService = contentDataService;
            _contentService = contentService;
        }

        public void ProcessQueries()
        {
            var now = _timeService.Now;
            var queries = _queryDataService.GetUnprocessed(now);
            var contentItems = queries.AsParallel().WithDegreeOfParallelism(3)
                .SelectMany(q => _contentService.Query(q.Text));
            _contentDataService.Update(contentItems);
        }
    }
}

Note that the services are only defined as interfaces which we declare in a separate project called ContentAggregator.Interfaces, referred above in the usings block.

Let's ignore the job processor mechanism for a moment and just run ProcessQueries in a test method in the main controller. For this I will have to make dependency injection work and implement the interfaces. For brevity I will do so in the main project, although it would probably be a good idea to do it in a separate ContentAggregator.Implementations project. But let's not get ahead of ourselves. First make the code work, then arrange it all nice, in the refactoring phase.

Implementing the services

I will create mock services first, in order to test the code as it is, so the following implementations just do as little as possible while still following the interface signature.
public class ContentDataService : IContentDataService
{
    private readonly static StringBuilder _sb;

    static ContentDataService()
    {
        _sb = new StringBuilder();
    }

    public void Update(IEnumerable<ContentItem> contentItems)
    {
        foreach (var contentItem in contentItems)
        {
            // mock: just record the items so we can inspect them via Output
            _sb.AppendLine(contentItem.Content);
        }
    }

    public static string Output
    {
        get { return _sb.ToString(); }
    }
}

public class ContentService : IContentService
{
    private readonly ITimeService _timeService;

    public ContentService(ITimeService timeService)
    {
        _timeService = timeService;
    }

    public IEnumerable<ContentItem> Query(string text)
    {
        yield return
            new ContentItem
            {
                OriginalUrl = "http://original.url",
                FinalUrl = "https://final.url",
                Title = "Mock Title",
                Description = "Mock Description",
                CreationTime = _timeService.Now,
                Time = new DateTime(2017, 03, 26),
                ContentType = "text/html",
                Error = null,
                Content = "Mock Content"
            };
    }
}

public class QueryDataService : IQueryDataService
{
    public IEnumerable<Query> GetUnprocessed(DateTime now)
    {
        yield return new Query
        {
            Text = "Some query"
        };
    }
}

public class TimeService : ITimeService
{
    public DateTime Now
    {
        get
        {
            return DateTime.UtcNow;
        }
    }
}

Now all I have to do is declare the binding between interface and implementation. The magic happens in ConfigureServices, in Startup.cs:
services.AddTransient<ITimeService, TimeService>();
services.AddTransient<IContentDataService, ContentDataService>();
services.AddTransient<IContentService, ContentService>();
services.AddTransient<IQueryDataService, QueryDataService>();

They are all transient, meaning that for each request of an implementation the system will just create a new instance. Another popular method is AddSingleton.

Using dependency injection

So, now I have to instantiate my query processor and run ProcessQueries.

One way is to set QueryProcessor as a service. I extract an interface, I add a new binding and then I give an interface as a parameter of my controller constructor:
public class HomeController : Controller
{
    private readonly IQueryProcessor _queryProcessor;

    public HomeController(IQueryProcessor queryProcessor)
    {
        _queryProcessor = queryProcessor;
    }

    public IActionResult Index()
    {
        return View();
    }

    public string Test()
    {
        _queryProcessor.ProcessQueries();
        return ContentDataService.Output;
    }
}
In fact, I don't even have to declare an interface. I can just use services.AddTransient<QueryProcessor>(); in ConfigureServices and it works as a parameter to the controller.

But what if I want to use it directly, resolve it manually, without injecting it in the controller? One can use the injection of a IServiceProvider instead. Here is an example:
public class HomeController : Controller
{
    private readonly IServiceProvider _serviceProvider;

    public HomeController(IServiceProvider serviceProvider)
    {
        _serviceProvider = serviceProvider;
    }

    public IActionResult Index()
    {
        return View();
    }

    public string Test()
    {
        var queryProcessor = _serviceProvider.GetService<QueryProcessor>();
        queryProcessor.ProcessQueries();
        return ContentDataService.Output;
    }
}
Yet you still need to use services.Add... in ConfigureServices and inject the service provider in the constructor of the controller.

There is a way of doing it completely separately like this:
var serviceProvider = new ServiceCollection()
    .AddTransient<ITimeService, TimeService>()
    .AddTransient<IContentDataService, ContentDataService>()
    .AddTransient<IContentService, ContentService>()
    .AddTransient<IQueryDataService, QueryDataService>()
    .AddTransient<QueryProcessor>()
    .BuildServiceProvider();
var queryProcessor = serviceProvider.GetService<QueryProcessor>();

This would be the way to encapsulate the ASP.Net Dependency Injection in another object, maybe in a console application, but clearly it would be pointless in our application.

The complete source code after these modifications can be found here. Test the functionality by going to /test on your local server after you start the app.

Tuesday, March 14, 2017

The Call, by Peadar Ó Guilín

book cover I've seen several very positive reviews of The Call, by Peadar Ó Guilín so I started reading it. A few hours later I had finished it. It was good: well written, with compelling characters, a fresh idea and a combination of young adult and body horror mixed with Irish mythology that hooked me immediately. I was sorry it had ended and simultaneously hoped for and cursed the idea of "trilogizing" it.

So the book follows a girl who can't use her legs because of polio. She is a happy child until her parents explain the realities to her: Ireland is separated from the world by an impassable barrier, and the Aes Sidhe, the Irish fairies, kidnap every adolescent once, hunting and hurting them in horrific ways as revenge for the Irish banishing them to a hellish world. When "the call" comes, the child disappears, leaving behind anything that is not part of their body, and returns in 184 seconds. However, they experience an entire day in the colorless, ugly and cruel world of the Sidhe, where they have to fight for their lives. In response, the Irish nation organizes in order to survive, with mandatory child births and training centers where teens are prepared for the call in the hope they will survive.

One might think this is something akin to young adult novels like The Maze Runner, but this is much better. The main character has to overcome her disability as well as the condescending pity or disgust of others. She must manage her crush on a boy in school as well as the rules, both societal and self-imposed, about expressing emotion in a world where any friend you have may just disappear in front of you and be returned a monster, or dead. Her friends are equally well defined, without the book being overly descriptive. The fairies have the ability to change the human body with a mere touch, so even the few kids who survive return mentally and bodily deformed. The gray world itself is filled with horrors, with an ecosystem of carnivorous plants and animals that are actually made of altered humans, from hunting dogs and mounts to worms and spiders, which somehow still maintain some sort of sentience so they can feel pain. I found the Aes Sidhe compelling: they are incredibly beautiful people, full of joy and merriment even as they maim and torture and kill, and even when they are themselves in pain or dying; a race of psychotic, vengeful people who know nothing but hate.

So I really liked the book and recommend it highly.

Monday, March 13, 2017

When refactoring is bad.

People who know me often snicker whenever someone utters "refactoring" nearby. I am a strong proponent of refactoring code and I have feelings almost as strong for the managers who disagree. Usually they have really stupid reasons for it, too. However, speaking with a colleague the other day, I realized that refactoring can be bad as well. So here I will explore this idea.

Why refactor at all?

Refactoring is the process of rewriting code so that it is more readable and maintainable. It does not mean writing code to be readable and maintainable from the beginning, and it does not mean that by doing it you admit your code was not good when you first wrote it. Usually the scope of refactoring is larger than localized bits of code and takes into account several areas of your software. It also has the purpose of aligning your codebase with the inevitable scope creep of any project. But more than this, its primary use is to ease the work of the people who will work on the code later on, be it yourself or some colleague.

I was talking with this friend of mine and he explained to me how, especially in the game industry, managers are reluctant to spend resources on cleaning up old code before actually starting work on new code, since release dates come fast and technologies change rapidly. I replied that, to me, refactoring is not something to be done before you write code, but after, as a phase of the development process. In fact, there was even a picture showing it on a wheel: planning, implementing, testing and bug fixing, refactoring. I searched for it, but I found so many other different ideas that I decided it would be pointless to show it here. However, most of these images and presentation files specified maintenance as the final step of a software project. For most projects, use and maintenance is the longest phase in the cycle. It makes sense to invest in making it easier for your team.

So how could any of this be bad?

Well, there are types of projects that are fire and forget; they disappear after a while, their codebase abandoned. Their maintenance phase is tiny or nonexistent, and therefore refactoring the code has a limited value. But that is still not a case where refactoring is wrong, just one where it is less useful. I believe there are situations where refactoring can have an adverse effect, and that is exactly the scenario my friend mentioned: before starting to code. Let me expand on that.

Refactoring is a process of rewriting code, which implies you not only have a codebase you want to rewrite, but also that you know how to do it. Except in the very limited cases where some project is bought by another company with far more experienced developers and you just need to clean up garbage, there is no need to touch code that you are just beginning to understand. To refactor after you've finished a planned development phase (a Scrum sprint, for example, or a completed feature) is easy, since you understand how the code was written, what the requirements have become, and maybe you are lucky enough to have unit tests on the working code. It's the "now I have it working, let's clean it up a little" phase. Alternatively, doing it when you want to add things is bad, because you barely remember what was done and by whom. Moreover, you probably want to add features, so changing the old code just to accommodate adding some other code makes little sense. Management will surely not only not approve, but even consider it a hostile request from a stupid techie who only cares about the beauty of code and doesn't understand the commercial realities of the project. So suggest something like this and you risk souring the entire team on the prospect of refactoring code.

Another refactoring antipattern is when someone decides the architecture needs to be more flexible, so flexible that it could do anything, therefore they rearchitect the whole thing, using software patterns and high level concepts, but ignoring the actual functionality of the existing code and the level of seniority in their team. In fact, I wouldn't even call this refactoring, since it doesn't address problems with code structure, but rewrites it completely. It's not making sure your building is sturdy and all water pipes are new, it's demolishing everything, building something else, then bringing the same furniture in. Indeed, even as I like beautiful code, doing changes to it solely to make it prettier or to make you feel smarter is dead wrong. What will probably happen is that people will get confused on the grand scheme of things and, without expensive supervision in terms of time and other resources, they will start to cut corners and erode the architecture in order to write simpler code.

There is a system where software is released in "versions". People just write crappy code and pile features one over the other, in the knowledge that if the project is a success, the next version will be well written. However, that rarely happens. Rewriting money-making code is perceived as a loss by the financial managers. Trust me on this: the shitty code you write today will haunt you for the rest of the project's lifetime and even in its afterlife, when other projects are started from cannibalized codebases. That said, I am not a proponent of trying to write the code right from the very beginning either, mostly because no one actually knows what it should really do until they finish writing it.

Refactoring is often associated with Test Driven Development, probably because they are both difficult to sell to management. It would be a mistake to think that refactoring is useful only in that context. Sure, it is a best practice to have unit tests on the piece of code you need to refactor, but let's face it, reality is hard enough as it is.

Last, but not least, there is the partial or incomplete refactoring. It starts, and then, somewhere around the middle of the effort, new feature requests arrive. The refactoring is "paused", but now part of your code is written one way and the rest another. The perception will be that refactoring was not only useless, but even detrimental. The same happens when you decide to do it, then allow yourself to avoid or postpone it, or do it badly enough that it doesn't help at all. Doing it just for the sake of saying you do it is plain bad.

The right time and the right people

I personally believe that refactoring should be done at the end of each development interval, while you are still familiar with the feature and its implementation. Done like this, it doesn't even need special approval, it's just the way things are done, it's the shop culture. It is not what you do after code review - simple code cleaning suggested by people who took five minutes to look it over - it is a team effort to discuss which elements are difficult to maintain or easy to simplify, reuse or encapsulate. It is not a job for juniors, either. You don't grab the youngest guy on the team and let him rearrange the code of more experienced people, even if that would seem to teach him a lot. Nor is it something that senior devs should be allowed to do only in their spare time. They might like it, but caring about the project is your responsibility, not something you expect your team to do when you are too lazy or too cheap. Finally, refactoring is not an excuse to write bad code in the hope you will fix it later.

From the way I am talking about this, you probably believe I've worked in many teams where refactoring was second nature and no one doubted its utility. You would be wrong. Because it is poorly understood, the reaction of non-technical people on a software team to the concept of refactoring usually falls somewhere between condescension and terror. Money people don't understand why you would change something that works, managers can't sell it as a good thing, production and art people don't care. Even worse, most technical people would rather write new stuff than rearrange old stuff, and some might even take offense at attempts to make "their code" better. But they will start to mutter and complain a lot when they get to the maintenance phase, or when they have to write features over old code, maybe even their own, and have difficulty understanding why the code is not written in a way that would make their work easy. And when managers go to their dashboards and compare team productivity, they will raise eyebrows at a chart that shows clear signs of slowing down.

Refactoring has a nasty side effect: it threatens jobs. If the code were clean and any change easy to perform, there would be a lot of pressure on the decision makers to justify their jobs. They would have to come up with relevant new ideas all the time. If the effort to maintain code or add new features were small, there would be pressure on developers to justify their jobs as well. Why keep a large team for a project that could easily be handled by a few junior devs who occasionally add something? Refactoring is the bane of the type of worker who does their job confusingly enough that only they can continue to do it, or who pretends to be managing a difficult project when they are the one making it difficult. So in certain situations, for example in single-product companies, refactoring will make people fear being made redundant. Yet in others it will accelerate the speed of development for new projects, improve morale and win a shitload of money.

So my parting thoughts are these: sell it right and do it right! Most likely it will have a positive effect on the entire project and team. People will be happier and more productive, which means their bosses will be happier and filthy rich. Do it badly or sell it wrong and you will alienate people and be cursing shitty code for as long as you work there.

Sunday, March 12, 2017

Arkwright, by Allen M. Steele

book cover Inspired by the writings of classics like Asimov, Heinlein and Clarke, Arkwright is a short book that spans several centuries of space exploration and colonization, so after a very positive review on Io9, I decided to read it. My conclusion: a reedited collection of poorly written short stories, it is optimistic and nostalgic enough to be read without effort, but it doesn't really teach anything. Like many of the works that inspired it, it feels anachronistic, yet it was published in 2016, which makes me wonder why anyone reviewed it so positively. Perhaps if reviews didn't word things so bombastically ("sweeping epic", "hard science fiction", etc.), I would enjoy more the books that are clearly nothing of the sort.

Long story short, it starts with a group of 1939 science fiction writers, one of whom eventually has a huge success. On his deathbed, he leaves his entire fortune to a foundation whose purpose is to invest in and support space colonization, in particular of other star systems. Somehow, this seed money manages to successfully fund the construction of a beam-propelled sail starship which ends up putting people on another star's planet. Most of the book is the story of the family descendants who "live the dream" by monitoring the long journey of the automated ship.

First of all, I didn't enjoy the writing style. Episodic and descriptive, it felt more appropriate for a history book or a diary than a science fiction novel. Second, the biases of the writer are made more than evident when he belittles anti-science protesters and religious colonists who believe in the starship as their god. It's not that I disagree with him, but it was written so condescendingly that it bothered me. Same with the "I told you so" part with the asteroid on a collision course with Earth. Same when the Arkwright descendants are pretty much strongarmed into getting into the family business. And third, while focusing on the Arkwright clan, the book completely ignores the rest of the world. While explaining how they designed, constructed and monitored a starship for generations, the author ignores any scientific breakthroughs that happened during that time. It is as if the only people who cared about science and space expansion were the Arkwrights. It makes the book feel very provincial. I would have preferred to see them in a global context, rather than read about their family issues.

I liked the sentiment, though. The idea that if you put your mind to something, you can do it. Of course, ignoring economic, technical and probabilistic realities does help when you write the book, but still. The story is centered on an old science fiction writer who takes humanity to another star, clearly something the author would have liked to be autobiographical. It felt like one of those stories grandpas tell their grandchildren, all moral and wise, yet totally boring. It's not that they don't mean well or that the moral isn't good, but the way they tell it makes it unappetizing to small children. If I had to use one word to describe this book, it would be: unappetizing.

Funny thing is that I've read a similar centuries-spanning book about the evolution of mankind that I liked a lot more and that was much better written. I would suggest you skip Arkwright and instead try Accelerando, by Charles Stross.

Friday, March 10, 2017

Learning ASP.Net MVC - Intermezzo - Upgrading to .Net Core 1.1

It is about time to revisit my series on ASP.Net MVC Core. Since my last blog post, the .NET Core version has changed to 1.1, so just installing the SDK and running the project was not going to work. This post explains how to upgrade a .NET Core project to the latest version.

Learning ASP.Net MVC series:
  1. Setup
  2. MVC Concepts
  3. Authentication
  4. Entity Framework Fundamentals
  5. Upgrading project to .NET Core 1.1
  6. Dependency Injection and Services

Short version

Beware: pressing the batch Update button for NuGet packages corrupted my project.json. With that in mind, here are the steps to successfully migrate a .NET Core project to a higher version.

  1. Download and install the .NET Core 1.1 SDK
  2. Change the version of the SDK in global.json - you can find out the SDK version by creating a new .Net Core project and checking what it uses
  3. Change "netcoreapp1.0" to "netcoreapp1.1" in project.json
  4. Change Microsoft.NETCore.App version from "1.0.0" to "1.1.0" in project.json
  5. Add
    "runtimes": {
        "win10-x64": { }
    }
    to project.json
  6. Go to "Manage NuGet packages for the solution", to the Update tab, and update packages one by one. Do not press the batch Update button for selected packages
  7. Some packages will restore, but remain in the list. Skip them for now
  8. Whenever you see a "downgrade" warning when restoring, go to those packages and restore them next
  9. For packages that tell you to upgrade NuGet, ignore them; it's an error that probably happens because you restore a package while the previous package restore has not completed
  10. For the remaining packages that just won't update, write down their names, uninstall them and reinstall them
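To make steps 3 to 5 concrete, here is a minimal sketch of what the relevant sections of project.json look like after the changes. This is not a complete file - your real project.json will contain many more dependencies and settings - and "win10-x64" is just the runtime identifier for my Windows 10 machine; use the one matching your OS:

```json
{
  "dependencies": {
    "Microsoft.NETCore.App": {
      "version": "1.1.0",
      "type": "platform"
    }
  },
  "frameworks": {
    "netcoreapp1.1": {}
  },
  "runtimes": {
    "win10-x64": {}
  }
}
```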

Code after changes can be found on GitHub

That should do it. For detailed steps of what I actually did to get to this concise list, read on.

Long version

Step 0 - I don't care, just load the damn project!

Downloaded the source code from GitHub, loaded the .sln with Visual Studio 2015. Got a nice blocking alert, because this was a .NET Core virgin computer:
Of course, I could have tried to install that version, but I wanted to upgrade to the latest Core.

Step 1 - read the Microsoft documentation

And here I went to Announcing the Fastest ASP.NET Yet, ASP.NET Core 1.1 RTM. I followed the instructions there, made Visual Studio 2015 load my project and automatically restore packages:
  1. Download and install the .NET Core 1.1 SDK
  2. If your application is referencing the .NET Core framework, you should update the references in your project.json file for netcoreapp1.0 or Microsoft.NetCore.App version 1.0 to version 1.1. In the default project.json file for an ASP.NET Core project running on the .NET Core framework, these two updates are located as follows:

    Two places to update project.json to .NET Core 1.1

  3. to be continued...

I got to the second step, but still got the alert...

Step 2 - fumble around

... so I commented out the sdk property in global.json. I got another alert:

This answer recommended uninstalling old versions of SDKs, in my case "Microsoft .NET Core 1.0.1 - SDK 1.0.0 Preview 2-003131 (x64)". Don't worry, it didn't work. More below:

TL;DR version: do not uninstall the Visual Studio .NET Core Tooling

And then... I got the same No executable found matching command "dotnet-projectmodel-server" error again.

I created a new .NET Core project just to see the version of the SDK it uses, 1.0.0-preview2-003131, added it to global.json and reopened the project. It restored packages and didn't throw any errors! Dude, it even compiled and ran! But now I got a System.ArgumentException: The 'ClientId' option must be provided. It probably had something to do with the Secret Manager. Follow the steps in the link to store your secrets in the app. It then worked.
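As a side note, the Secret Manager stores these values in a secrets.json file under your user profile, outside the project tree, so they never end up in source control. Assuming the app reads the two option names straight from the exception message (the actual keys depend on how the project binds its configuration, so treat these as placeholders), the stored file would look something like this:

```json
{
  "ClientId": "your-client-id-here",
  "ClientSecret": "your-client-secret-here"
}
```

The values can also be set from the command line with dotnet user-secrets set, provided the Microsoft.Extensions.SecretManager.Tools package is referenced in project.json.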

Step 1.1 (see what I did there?) - continue to read the Microsoft documentation

I removed the third step of the Microsoft instructions from my list above because it caused me some problems. So don't do it, yet. It was:
  1. Update your ASP.NET Core packages dependencies to use the new 1.1.0 versions. You can do this by navigating to the NuGet package manager window and inspecting the “Updates” tab for the list of packages that you can update.

    Package list in NuGet package manager UI in Visual Studio

    Updating Packages using the NuGet package manager UI with the last pre-release build of ASP.NET Core 1.1

Since I had not yet upgraded the packages, as in Microsoft's third step, I decided to do it. 26 updates were waiting for me, so I optimistically selected them all and clicked Update. Of course, errors! One popped up as more interesting: Package 'Microsoft.Extensions.SecretManager.Tools 1.0.0' uses features that are not supported by the current version of NuGet. To upgrade NuGet, see... Another was even more worrisome: Unexpected end of content while loading JObject. Path 'dependencies', line 68, position 0 in project.json. Somehow the update operation had corrupted project.json! From a 3050-byte file, it was now 1617 bytes.

Step 3 - repair what the Microsoft instructions broke

Suspecting a problem with the NuGet package manager, I went to the link in the first error. But NuGet is included in Visual Studio 2015 and it was clearly the latest version. So the only solution was to go through each package and see which one caused the problem. I went through all 26 packages and pressed Install on each, and it worked. Apparently, the batch Update button was causing the issue. Weirdly enough, two packages were installed but remained in the Update tab and also appeared in the Consolidate tab: BundleMinifier.Core and Microsoft.EntityFrameworkCore.Tools, although I can't do anything with them there.

Another package (Microsoft.VisualStudio.Web.CodeGeneration.Tools 1.0.0) caused another confusing error: Package 'Microsoft.VisualStudio.Web.CodeGeneration.Tools 1.0.0' uses features that are not supported by the current version of NuGet. To upgrade NuGet, see... Yet restarting Visual Studio made the CodeGeneration.Tools error disappear.

So I tried to build the project, only to be met with yet another project.json corruption error: Can not find runtime target for framework '.NETCoreApp,Version=v1.0' compatible with one of the target runtimes: 'win10-x64, win81-x64, win8-x64, win7-x64'. Possible causes: [blah blah] The project does not list one of 'win10-x64, win81-x64, win7-x64' in the 'runtimes' [blah blah]. I found the fix here, which was to add
"runtimes": {
    "win10-x64": { }
}
to project.json.

It compiled. It worked.