Differences between Azure Functions v1 and v2 in C#

I’ve been messing around in the .NET ecosystem again and have jumped back in with Azure Functions (similar to AWS Lambda) to get my blog onto 99% static hosting. I immediately ran into the API changes between v1 and v2 (currently in beta).

These changes came about because v1 was based on .NET 4.6 using WebAPI 2 while v2 is based on ASP.NET Core, which uses MVC 6. There are some guides around for converting but none in the pure context of Azure Functions.

I’ll illustrate with a PageViewCount sample that uses Table Storage to retrieve and update a simple page count.

v1 (.NET 4.6.1 / WebAPI 2)

[FunctionName("PageView")]
public static async Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get")]HttpRequestMessage req, TraceWriter log) {
    var page = req.RequestUri.ParseQueryString()["page"];
    if (String.IsNullOrEmpty(page))
        return req.CreateErrorResponse(HttpStatusCode.BadRequest, "'page' parameter missing.");

    var table = Helpers.GetTableReference("PageViewCounts");
    var pageView = await table.RetrieveAsync<PageViewCount>("damieng.com", page)
        ?? new PageViewCount(page) { ViewCount = 0 };
    var operation = pageView.ViewCount == 0
        ? TableOperation.Insert(pageView)
        : TableOperation.Replace(pageView);
    pageView.ViewCount++;
    await table.ExecuteAsync(operation);

    return req.CreateResponse(HttpStatusCode.OK, new { viewCount = pageView.ViewCount });
}

v2 (ASP.NET Core / MVC 6)

[FunctionName("PageView")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get")]HttpRequest req, TraceWriter log) {
    var page = req.Query["page"];
    if (String.IsNullOrEmpty(page))
       return new BadRequestObjectResult("'page' parameter missing.");

    var table = Helpers.GetTableReference("PageViewCounts");
    var pageView = await table.RetrieveAsync<PageViewCount>("damieng.com", page)
        ?? new PageViewCount(page) { ViewCount = 0 };
    var operation = pageView.ViewCount == 0
        ? TableOperation.Insert(pageView)
        : TableOperation.Replace(pageView);
    pageView.ViewCount++;
    await table.ExecuteAsync(operation);

    return new OkObjectResult(new { viewCount = pageView.ViewCount });
}

Differences

The main differences are that:

  1. Results are returned as IActionResult/ObjectResult objects rather than created via extension methods on HttpRequestMessage (easier to mock or create custom ones)
  2. Input is the HttpRequest object rather than HttpRequestMessage (easier to get query parameters)

If you get the error ‘Can not create abstract class’ when executing your function then you are using the types from the wrong version for that environment.

Helpers

Both functions above use a small helper class to take care of Table Storage, which doesn’t have the nicest API to use. A wrapper much like a data context that ensures the right types go to the right table names might be an even better option (a quick sketch of that idea follows the helper class below).

static class Helpers {
    public static CloudStorageAccount GetCloudStorageAccount() {
        var connection = ConfigurationManager.AppSettings["DamienGTableStorage"];
        return connection == null ? CloudStorageAccount.DevelopmentStorageAccount : CloudStorageAccount.Parse(connection);
    }

    public static CloudTable GetTableReference(string name) {
        return GetCloudStorageAccount().CreateCloudTableClient().GetTableReference(name);
    }

    public static async Task<T> RetrieveAsync<T>(this CloudTable cloudTable, string partitionKey, string rowKey)
        where T:TableEntity {
        var tableResult = await cloudTable.ExecuteAsync(TableOperation.Retrieve<T>(partitionKey, rowKey));
        return (T)tableResult.Result;
    }
}
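
For example, a minimal sketch of that data-context idea might look something like this (the PageViewContext shape is illustrative and not part of the original code):

static class PageViewContext {
    // Ties the PageViewCount entity type to its table name so callers can't mix them up.
    static CloudTable Table => Helpers.GetTableReference("PageViewCounts");

    public static Task<PageViewCount> GetAsync(string site, string page) =>
        Table.RetrieveAsync<PageViewCount>(site, page);

    public static Task SaveAsync(PageViewCount pageView) =>
        Table.ExecuteAsync(TableOperation.InsertOrReplace(pageView));
}

Using TableOperation.InsertOrReplace there would also remove the need for the Insert/Replace branch in the functions above, at the cost of losing the optimistic concurrency check that Replace performs via the entity's ETag.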

To compile

If you want to compile this, or maybe you were just looking for the code for a simple page counter, here’s the missing TableEntity class:

public class PageViewCount : TableEntity
{
    public PageViewCount(string pageName)
    {
        PartitionKey = "damieng.com";
        RowKey = pageName;
    }

    public PageViewCount() { }
    public int ViewCount { get; set; }
}

[)amien

Download files with progress in Electron via window.fetch

Working on Atom lately, I need to be able to download files to disk. We have a couple of ways to do this today but they do not show download progress, which leads to confusion and sometimes frustration on larger downloads such as updates or big packages.

There are many npm libraries out there but they either don’t expose a progress indicator or they bypass Chrome (thus not using proxy settings, caching and network inspector) by using node directly.

I’m also not a fan of sprawling dependencies to achieve what can be done simply in a function or two.

Hello window.fetch

window.fetch is a replacement for XMLHttpRequest currently shipping in Chrome (and therefore Electron) as well as a WHATWG living standard. While there is some documentation around, most of it relies on grabbing the entire content as JSON, a blob or text. That is not advisable for streaming, where files might be large and you want to not only minimize the memory impact but also display a progress indicator to your users.
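
For reference, the typical whole-body usage looks something like this (the fetchJson wrapper is just illustrative); it is fine for small payloads but buffers the entire response in memory:

// Typical non-streaming usage: buffers the entire response in memory before returning.
async function fetchJson(url) {
  const response = await fetch(url);
  return response.json();
}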

Thankfully the response body exposed by window.fetch has a getReader() function that will give you a ReadableStreamReader, although this reads in chunks (32 KB on my machine) and isn’t compatible with Node’s streams, pipes and data events.

Download function

With a little work though we can wire these two things up to get us a file downloader that has no extra dependencies outside of Electron, honors the Chrome cache, proxy and network inspector, and best of all is incredibly easy to use:

import fs from 'fs';

export default async function download(sourceUrl, targetFile, progressCallback, length) {
  const request = new Request(sourceUrl, {
    headers: new Headers({'Content-Type': 'application/octet-stream'})
  });

  const response = await fetch(request);
  if (!response.ok) {
    throw Error(`Unable to download, server returned ${response.status} ${response.statusText}`);
  }

  const body = response.body;
  if (body == null) {
    throw Error('No response body');
  }

  const finalLength = length || parseInt(response.headers.get('Content-Length') || '0', 10);
  const reader = body.getReader();
  const writer = fs.createWriteStream(targetFile);

  await streamWithProgress(finalLength, reader, writer, progressCallback);
  writer.end();
}

async function streamWithProgress(length, reader, writer, progressCallback) {
  let bytesDone = 0;

  while (true) {
    const result = await reader.read();
    if (result.done) {
      if (progressCallback != null) {
        progressCallback(length, 100);
      }
      return;
    }
    
    const chunk = result.value;
    if (chunk == null) {
      throw Error('Empty chunk received during download');
    } else {
      writer.write(Buffer.from(chunk));
      if (progressCallback != null) {
        bytesDone += chunk.byteLength;
        const percent = length === 0 ? null : Math.floor(bytesDone / length * 100);
        progressCallback(bytesDone, percent);
      }
    }
  }
}

A FlowType annotated version is also available.

Using it

Using it is simplicity itself: call it with a URL to download and a local file name to save it as, along with an optional callback that will receive download progress.

Downloader.download('https://download.damieng.com/fonts/original/EnvyCodeR-PR7.zip', 'envy-code-r.zip', (bytes, percent) => console.log(`Downloaded ${bytes} (${percent})`));

Caveats

Some servers do not send the Content-Length header. You have two options if this applies to you:

  1. Don’t display a percentage, just the KB downloaded count (the percentage will be null in the callback)
  2. Bake in the file size if it’s a static URL and pass it as the final parameter to the download function (see the example after this list)
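
For option 2 that might look something like this (the 1493500 byte count is just a placeholder for the real file size):

Downloader.download(
  'https://download.damieng.com/fonts/original/EnvyCodeR-PR7.zip',
  'envy-code-r.zip',
  (bytes, percent) => console.log(`Downloaded ${bytes} (${percent}%)`),
  1493500 // known file size in bytes, used when Content-Length is missing
);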

Enjoy!

[)amien

Typography in bits: For a few pixels more

It’s been a while since I visited the bitmap fonts of old computers (see the bottom of the post for links) but there are still some to look at!

There are a lot of subtle variations here as machines often used an off-the-shelf video chip and then made a few tweaks or had it slightly customized.

TRS-80 Color Computer & Dragon – custom MC6847 (1982)

TRS-80 system font

The initial model of the TRS-80 Color Computer – affectionately known as the CoCo – as well as the UK’s Dragon 32 & 64 computers used the Motorola MC6847 video display generator and so shared the same embedded font.

Unusual characteristics

  • No lowercase
  • Serifs on B&D
  • Over-extended ‘7’
  • Asterisk is a diamond!
  • Square ‘O’
  • Cute ‘@’
  • Thin ‘0?’
  • Tight counter on ‘4’
  • Unjoined strokes on ‘#’

Rationale

The font has some rough edges, although the softer, fuzzier look of a CRT TV almost certainly smoothed those out, as it did for many home computer fonts at the time. The awful dark-green-on-light-green color scheme wasn’t helping though.

Influences

It has similar proportions and characters to much of the Apple ][ font but feels like they tried to make the characters more distinguishable on low-quality TVs, hence the serifs on B & D and the differentiation between 0 and O.

Technical notes

Motorola actually offered custom versions of this ROM so it would have been entirely possible to have an alternative character set.

TRS-80 Color Computer v2+ (1985)

TRS-80 v2+ system font

The follow-up v2 model of the TRS-80 Color Computer – also known as the Tandy Color Computer – used an enhanced Motorola MC6847T1 variant.

Unusual characteristics

  • Serifs on B&D, over-extended 7 as per v1
  • Ugly ‘@’
  • Very soft center bar on ‘3’
  • Tight counter on ‘4’
  • Tight top of ‘f’

Rationale

In general this is a much improved font over the v1, fixing the oddities with the asterisk, O, 0, 3, 4, S, ? and #, as well as making the slashes straighter and reducing the boldness of the comma, colon, semi-colon and apostrophe, although the @ and 3 are worse than the previous version.

Influences

Based on the previous model, although the lower-case does have some resemblance to Apple and MSX. This may in fact be a custom version, as the spec sheet for the T1 variant shows bold versions of the ,;:.’ glyphs, shorter descenders on y and g, more curvature on p and q, stronger curves on 3, 6 and 9, a tighter t and a semi-broken #.

Technical notes

You can identify CoCo 2 models that have the lower-case font as they say Tandy on the screen rather than TRS-80.

Tatung Einstein (1984)

Tatung Einstein system font

The Tatung Einstein TC-01 was a British Z80-based machine launched in the UK that never really took off with the public but had some success in the game development world, being used to write and debug code for other more popular Z80 systems thanks to its CP/M-compatible OS and disk system (it came with the same oddball 3″ disks used on the Sinclair ZX Spectrum +3 and Amstrad CPC/PCW range).

Unusual characteristics

  • Odd missing pixels on ‘9S’
  • Little flourishes on ‘aq’
  • Massively tall ‘*’
  • Chunky joins on ‘Kv’
  • High counters and bowls on ‘gpqy’

Rationale

Given the 40-column mode, the generous spacing in the 32-column mode makes sense and the font isn’t too bad. Many of the negative unusual characteristics would be lost on a CRT.

Influences

It feels like the Sinclair Spectrum font with some horizontal width sacrifices.

Commodore 128 (1985)

Commodore 128 80-column font

While the follow-up to the Commodore 64 used the exact same font at boot – it had the same VIC-II video chip – switching it into 80-column mode reveals a new font with double-height pixels powered by the MOS 8563 VDC.

Unusual characteristics

  • ‘£’ aligned left not right, thin strokes
  • ‘Q’ fails to take advantage of descender
  • Cluttered redundant stroke on ‘7’
  • Rounded ‘<>’

Rationale

Quite a nice font with very little weirdness that probably looked good on any monitor at the time, although TVs would have struggled to display such fine verticals on some letters.

Influences

Technical notes

Switching to 80 column mode could be achieved by using the keyboard or the GRAPHIC 5 command.

Texas Instruments TI-99/4A (TMS9918) (1985)

TI-99/4A system font

The Texas Instruments TI-99/4A was TI’s own home computer built around their TMS9918 video display processor, a chip family that also found its way into the MSX machines, the ColecoVision and others.

Unusual characteristics

  • Lower case is small caps
  • Serifs on ‘BD’
  • Square ‘O’
  • Poor slope on ‘N’
  • Bar very tight on ‘G’

Rationale

The lower-case small caps feel quite awful and appear to be an attempt to avoid having to deal with descenders. Other fonts brought the bowl up a line so the descenders look a little off instead, although some machines like the Sinclair QL just left space for them.

Influences


Oric Atmos (1983)

Oric Atmos system font

The Oric Atmos was the follow-up to the Oric-1, a British 6502-based home computer range that found particular success in France.

Unusual characteristics

  • Bold ‘{}’
  • Vertical line on ‘^’
  • Awkward horizontal stroke on ‘k’
  • Square ‘mw’

Rationale

Not a bad choice, although I suspect cheaper TVs would struggle with the non-bold strokes and tight spacing, which is probably why they went with a high-contrast black-and-white color scheme.

Influences

A complete copy of the Apple ][ system font with only a few tweaks: removing the over-extension of 6 and 9 and un-bolding [ and ], although they weirdly forgot { and }. The additions of ^ and £ don’t quite fit right.

[)amien

Random tips for PowerShell, Bash & AWS

Now freelance again, I find myself solving a variety of unusual issues, many of which I could find no online solutions for.

Given these no doubt plague other developers let’s share!

Pass quoted args from BAT/CMD files to PowerShell

Grabbing args from a batch/command file is easy – just use %* – but have you ever tried passing them on to PowerShell like:

powershell "Something" "%*"

Unfortunately, if one of your arguments has quotes around it (a filename with a space, perhaps) then it becomes two separate arguments, e.g. "My File.txt" becomes My and File.txt.

PowerShell will only preserve them if you use the -File option (to run a .PS1 file) but that requires a relaxed policy via Set-ExecutionPolicy and so is a no-go for many people.

Given you can’t make PowerShell do the right thing with the args, the trick here is to not pass them as args at all!

SET MYPSARGS=%*
...
powershell -Command "Something $env:MYPSARGS"
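
Because the value travels via the environment, the PowerShell side can also simply read it back itself rather than taking a parameter at all; a minimal, purely illustrative sketch:

# Inside the invoked script or command: read the argument string exactly as the
# batch file stashed it, quotes and all.
$rawArgs = $env:MYPSARGS
Write-Host "Arguments as originally quoted: $rawArgs"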

Get Bash script path as Windows path

While Cygwin ships with cygpath to convert /c/something to c:\something etc., MSYS Bash shells do not have this. However, you can get it another way:

#!/bin/sh
pushd "$(dirname "$0")" > /dev/null
if command -v "cygpath" > /dev/null; then
  WINPWD="$(cygpath . -a -w)"
else
  WINPWD="$(pwd -W)"
fi
popd > /dev/null
echo "$WINPWD"

This works by switching the working directory to the one the script is in ("$(dirname "$0")") and then either asking cygpath for the absolute Windows form or capturing the print-working-directory output using the -W option, which emits it in Windows format. It then pops the working directory to make sure it goes back to where it was.

Note that this uses forward slashes as a directory separator still – a lot of stuff is okay with that but older apps and tools are not.

JSON encoding in API Gateway mapping templates

Using Amazon’s AWS Lambda you’ll also find yourself touching API Gateway, and while most of it is great the mapping templates are quite deficient in that they do not encode output by default despite specifying the MIME types.

All of Amazon’s example templates are exploitable via JSON injection. Just put a double-quote in a field and start writing your own JSON payload.
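
For example, with a naive mapping along the lines of "safeString": "$i.unsafeString" (no escaping), a request body such as this illustrative one:

{ "unsafeString": "harmless\", \"injected\": \"value" }

ends up emitting an extra property the template author never intended:

{ "safeString": "harmless", "injected": "value" }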

Amazon must fix this – encode by default like other templating systems such as ASP.NET Razor have done. Until then some recommend the Amazon-provided $util.escapeJavaScript(); however, while it encodes " as \" it also produces illegal JSON by encoding ' as \'.
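
If you do want to stick with the built-in helper, the commonly suggested workaround is to undo the bad single-quote escaping afterwards, something along these lines:

#set($i = $input.path('$'))
{
   "safeString": "$util.escapeJavaScript($i.unsafeString).replaceAll("\\'","'")"
}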

The mapping language is Apache Velocity Template Language (VTL) and, while it is not extensible, the fine print reveals that it internally uses Java strings and does not sandbox us away from their methods. This lets us utilize Java’s replaceAll functionality:

#set($i = $input.path('$'))
{
   "safeString": "$i.unsafeString.replaceAll("\""", "\\""")"
}

Show active known IPs on local network

I’m surprised more people don’t know how useful arp -a is, especially if you pipe it into ping…

Bash

arp -a | grep -o '[0-9]\{1,3\}\.[0-9]\{1,3\}\.[0-9]\{1,3\}\.[0-9]\{1,3\}' | xargs -L1 ping -c 1 -t 1 | sed -n -e 's/^.*bytes from //p'

PowerShell

(arp -a) -match "dynamic" | Foreach { ping -n 1 -w 1000 ($_ -split "\s+")[1] } | where { $_ -match "Reply from " } | % { $_.replace("Reply from ","") }

Wrapping up

I just want to mention that if you are doing anything on a command line, be it Bash, OS X, PowerShell or Command/Batch, then SS64 is a site worth visiting as they have great docs on many of these things!

[)amien