ITHCWY: Robert Ellison's Blog

Shapefile Update

A few people have asked for 3D shape support in my ESRI Shapefile library. I've never got around to it, but CodePlex user ekleiman has forked a version in his ESRI Shapefile to Image Convertor that supports PointZ, PolygonZ and PolyLineZ shapes. If that's what you need please check it out.

Crushing PNGs in .NET

I'm working on page speed and Google PageSpeed Insights is telling me that my PNGs are just way too large. Sadly .NET does not provide any way to optimize PNG images so there is no easy fix - just unmanaged libraries and command line tools.

I have an allergy to manual processes so I've lashed up some code to automatically find and optimize PNGs in my App_Data folder using PNGCRUSH. I can call CrushAllImages() to fix up everything or CrushImage() when I need to fix up a specific PNG. Code below:

public static void CrushAllImages()
{
    try
    {
        string appDataRoot = HostingEnvironment.MapPath("~/App_Data");
        if (appDataRoot == null)
        {
            return;
        }

        DirectoryInfo directoryInfo = new DirectoryInfo(appDataRoot);
        FileInfo[] pngs = directoryInfo.GetFiles("*.png", SearchOption.AllDirectories);
        foreach (FileInfo png in pngs)
        {
            CrushImage(png.FullName);
        }
    }
    catch (Exception ex)
    {
        //...
    }
}

public static void CrushImage(string fullPath)
{
    if (string.IsNullOrEmpty(fullPath))
    {
        return;
    }

    try
    {
        string markerPath = Path.ChangeExtension(fullPath, ".cng");
        if (File.Exists(markerPath))
        {
            return;
        }

        string crushExe = HostingEnvironment.MapPath("~/App_Data/pngcrush_1_7_77_w32.exe");

        ProcessStartInfo psi = new ProcessStartInfo(crushExe, string.Format(CultureInfo.InvariantCulture, "\"{0}\" \"{1}\"", fullPath, markerPath));
        psi.UseShellExecute = false;
        psi.CreateNoWindow = true;
        psi.LoadUserProfile = false;
        psi.WorkingDirectory = HostingEnvironment.MapPath("~/App_Data");

        Process p = Process.Start(psi);
        if (p == null)
        {
            throw new InvalidOperationException("No Process!");
        }
        p.WaitForExit();

        if (File.Exists(markerPath))
        {
            if (p.ExitCode == 0)
            {
                File.Copy(markerPath, fullPath, true);
                File.WriteAllText(markerPath, "Processed");
            }
            else
            {
                SiteLog.Log.Add(LogSeverity.Error, "CrushImage Failed (non-0 exit code) for " + fullPath);
                File.Delete(markerPath);
            }
        }
    }
    catch (Exception ex)
    {
       // ...
    }
}
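For completeness, one way to wire this up is to kick the whole thing off in the background at application startup. A minimal sketch, assuming the methods above live in a static ImageCrusher class (the class name and call site are illustrative, not part of the code above):

// Global.asax.cs - illustrative wiring; ImageCrusher is a hypothetical
// static class holding CrushAllImages/CrushImage from above
protected void Application_Start()
{
    // run on a worker thread so pngcrush doesn't block site startup
    System.Threading.ThreadPool.QueueUserWorkItem(
        _ => ImageCrusher.CrushAllImages());
}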

Minify and inline CSS for ASP.NET MVC

ASP.NET has a CssMinify class (and a JavaScript variant as well) designed for use in the bundling pipeline. But what if you want to have your CSS minified and inline? Here is an action that is working for me (rendered into a style tag on my _Layout.cshtml using @Html.Action("InlineCss", "Home")).

public ActionResult InlineCss()
{
    BundleContext context = new BundleContext(
        new HttpContextWrapper(System.Web.HttpContext.Current), 
        BundleTable.Bundles, 
        "~/Content/css");
            
    Bundle cssBundle = BundleTable.Bundles.GetBundleFor("~/Content/css");
    BundleResponse response = cssBundle.GenerateBundleResponse(context);
           
    CssMinify cssMinify = new CssMinify();
    cssMinify.Process(context, response);

    return Content(response.Content);
}
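The corresponding markup in _Layout.cshtml is just the action rendered inside a style tag:

<style type="text/css">
    @Html.Action("InlineCss", "Home")
</style>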

Note that I'm using this to inline CSS for this blog. The pages are cached so I'm not worried about how well this action performs. My blog is also basically all landing pages, so I'm not worried about caching a non-inline version for later use; I just drop all the CSS on every page.

Personal Finger Daemon for Windows

Did you know that Windows still has a vestigial finger command with just about nothing left to talk to? One of my New Year's resolutions is to bring finger back and unlike the stalled webfinger project I need to make some progress. Here's some C# to run your own personal finger daemon... you just need to create a .plan file in your home directory (haven't done that for a while):

using System;
using System.Diagnostics;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;

namespace fingerd
{
    static class Program
    {
        private const int FingerPort = 79;
        private const int MaxFingerCommand = 256;
        private const string PlanFile = ".plan";

        private static readonly TcpListener _tcpListener = new TcpListener(
             IPAddress.Any, FingerPort);

        static void Main()
        {
            _tcpListener.Start();

            while (true)
            {
                TcpClient tcpClient = _tcpListener.AcceptTcpClient();
                Thread clientThread = new Thread(ClientThread);
                clientThread.Start(tcpClient);
            }
        }

        static void ClientThread(object client)
        {
            NetworkStream clientStream = null;
            TcpClient tcpClient = client as TcpClient;
            if (tcpClient == null) { return; }

            try
            {
                byte[] command = new byte[MaxFingerCommand];
                clientStream = tcpClient.GetStream();

                int read = clientStream.Read(command, 0, command.Length);
                if (read == 0) { return; }

                ASCIIEncoding asciiEncoding = new ASCIIEncoding();
                string commandText = asciiEncoding.GetString(command);

                int endOfCommand = commandText.IndexOf("\r\n",
                    StringComparison.InvariantCultureIgnoreCase);
                if (endOfCommand <= 0) { return; }

                string user = commandText.Substring(0, endOfCommand);
                if (string.Compare(user, Environment.UserName, 
                    StringComparison.InvariantCultureIgnoreCase) != 0) { return; }

                string planPath = Path.Combine(Environment.GetFolderPath(
                    Environment.SpecialFolder.UserProfile),
                    PlanFile);
                if (!File.Exists(planPath)) { return; }

                string plan = File.ReadAllText(planPath) + "\r\n";
                byte[] planBytes = asciiEncoding.GetBytes(plan);
                clientStream.Write(planBytes, 0, planBytes.Length);
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex);
            }
            finally
            {
                if (clientStream != null)
                {
                    clientStream.Close();
                }
                tcpClient.Close();
            }
        }
    }
}
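To try it out, run the daemon and then finger your own user name from a command prompt (the code above only answers requests that match Environment.UserName, so substitute yours):

C:\> finger YourWindowsUserName@localhost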

Fix search on enter problem in BlogEngine.NET

Search on enter has been broken for a while in BlogEngine.NET (I'm running the latest 2.8.0.1 version). Finally got a chance to look at this today and there is a simple patch to the JavaScript to fix it. See the issue I just filed on CodePlex for details.

How to get SEO credit for Facebook Comments (the missing manual)

I've been using the Facebook Comments Box on this blog since I parted ways with Disqus. One issue with the Facebook system is that you won't get SEO credit for comments displayed in an iframe. They have an API to retrieve comments but the documentation is pretty light and so here are three critical tips to get it working.

The first thing to know is that comments can be nested. Once you've got a list of comments to enumerate through you need to check each comment to see if it has its own list of comments, and so on. This is easy to handle recursively (see AppendComments in the code below).

The second thing is that the first page of JSON returned from the API is totally different from the other pages. This is crazy and can bite you if you don't test it thoroughly. For https://developers.facebook.com/docs/reference/plugins/comments/ the first page is https://graph.facebook.com/comments/?ids=https://developers.facebook.com/docs/reference/plugins/comments/. The second page is embedded at the bottom of the first page and is currently https://graph.facebook.com/10150360250580608/comments?limit=25&offset=25&__after_id=10150360250580608_28167854 (if that link is broken check the first page for a new one). The path to the comment list is "https://developers.facebook.com/docs/reference/plugins/comments/" -> "comments" -> "data" on the first page and just "data" on the second. So you need to handle both formats as well as the URL being included as the root object on the first page. Don't know why this would be the case, just need to handle it.

Last but not least you want to include the comments in a way that can be indexed by search engines but not visible to regular site visitors. I've found that including the SEO list inside the fb:comments tag does the trick, i.e.

<fb:comments href="..." width="630" num_posts="10">*Include SEO comment list here*</fb:comments>

I've included the source code for an ASP.NET user control below - this is the code I'm using on the blog. You can see an example of the output on any page with Facebook comments. The code uses Json.net.

FacebookComments.ascx:

<%@ Control Language="C#" AutoEventWireup="true" CodeFile="FacebookComments.ascx.cs" 
  Inherits="LocalControls_FacebookComments" %>
<asp:Literal ID="LiteralFacebookComments" runat="server" />

FacebookComments.ascx.cs:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Globalization;
using System.Net;
using System.Text;
using System.Web;
using System.Web.Caching;
using Newtonsoft.Json.Linq;

// ReSharper disable CheckNamespace
public partial class LocalControls_FacebookComments : System.Web.UI.UserControl
// ReSharper restore CheckNamespace
{
    private const string CommentApiTemplate = "https://graph.facebook.com/comments/?ids={0}";
    private const string CacheTemplate = "localfacebookcomments_{0}";
    private const int CacheHours = 3;

    public string PostUrl { get; set; }

    protected void Page_Load(object sender, EventArgs e)
    {
        try
        {
            if (!string.IsNullOrWhiteSpace(PostUrl))
            {
                string cacheKey = string.Format(CultureInfo.InvariantCulture, 
                    CacheTemplate, PostUrl);

                if (HttpRuntime.Cache[cacheKey] == null)
                {
                    StringBuilder commentBuilder = new StringBuilder();

                    string url = string.Format(CultureInfo.InvariantCulture,
                                               CommentApiTemplate,
                                               PostUrl);

                    while (!string.IsNullOrWhiteSpace(url))
                    {
                        string json;
                        using (WebClient webClient = new WebClient())
                        {
                            json = webClient.DownloadString(url);
                        }

                        // parse comments
                        JObject o = JObject.Parse(json);
                        if ((o[PostUrl] != null) &&
                            (o[PostUrl]["comments"] != null) &&
                            (o[PostUrl]["comments"]["data"] != null))
                        {
                            // first page
                            AppendComments(o[PostUrl]["comments"]["data"], commentBuilder);
                        }
                        else if (o["data"] != null)
                        {
                            // other pages
                            AppendComments(o["data"], commentBuilder);
                        }
                        else
                        {
                            break;
                        }

                        // next page URL
                        if ((o[PostUrl] != null) &&
                            (o[PostUrl]["comments"] != null) &&
                            (o[PostUrl]["comments"]["paging"] != null) &&
                            (o[PostUrl]["comments"]["paging"]["next"] != null))
                        {
                            // on first page
                            url = (string) o[PostUrl]["comments"]["paging"]["next"];
                        }
                        else if ((o["paging"] != null) &&
                                 (o["paging"]["next"] != null))
                        {
                            // on subsequent pages
                            url = (string) o["paging"]["next"];
                        }
                        else
                        {
                            url = null;
                        }
                    }

                    string comments = commentBuilder.ToString();

                    HttpRuntime.Cache.Insert(cacheKey,
                        comments,
                        null,
                        DateTime.UtcNow.AddHours(CacheHours),
                        Cache.NoSlidingExpiration);

                    LiteralFacebookComments.Text = comments;
                }
                else
                {
                    LiteralFacebookComments.Text = (string)HttpRuntime.Cache[cacheKey];
                }
            }
        }
        catch (Exception)
        {
            LiteralFacebookComments.Text = string.Empty;
        }
    }

    private static void AppendComments(IEnumerable comments, 
        StringBuilder commentBuilder)
    {
        foreach (JObject comment in comments)
        {
            // write the comment text and author name; a plain div wrapper
            // is assumed here, any block-level markup works
            commentBuilder.AppendFormat(CultureInfo.InvariantCulture,
                                        "<div>{0} ({1})</div>\r\n",
                                        comment["message"],
                                        comment["from"]["name"]);

            // also write any nested comments
            if ((comment["comments"] != null) && (comment["comments"]["data"] != null))
            {
                AppendComments(comment["comments"]["data"], commentBuilder);
            }
        }
    }
}
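To drop the control onto a page, register it and set PostUrl; something like the following, where the path to the .ascx is an assumption about your folder layout:

<%@ Register Src="~/LocalControls/FacebookComments.ascx" 
  TagPrefix="local" TagName="FacebookComments" %>

<local:FacebookComments ID="FacebookComments1" runat="server" 
  PostUrl="http://example.com/a-blog-post.aspx" />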

The curious case of the missing slugs (in BlogEngine.net 2.8)

2013-06-16 Update: There is now a patch for the issue discussed below.

I just upgraded to BlogEngine.net 2.8 as it contains a fix for broken links from Facebook. There were a couple of hitches that I'll share in case they help anyone else.

I messed up the first upgrade attempt because the updater utility updates the source folder (containing the newly downloaded 2.8 code) instead of the destination folder (containing the current version of your blog). This is a little odd and the result was that I uploaded an unchanged instance and then embarrassingly complained that the Facebook bug hadn't been fixed. It had, just not in the folder I was expecting. I probably didn't pay enough attention to the instruction video.

Having got that out of the way I discovered that new posts were appearing with a bad link (to /.aspx instead of /blog-title.aspx). I rarely post using the editor as I have a home-grown post by email service running. After a bit of digging it turns out that prior to 2.8 you could leave the slug empty when creating a post, but now this results in the bad link. Luckily there isn't much effort required to fix this, you just need to set the slug before saving the new post:

if (string.IsNullOrWhiteSpace(post.Slug))
{
    post.Slug = Post.GetUniqueSlug(post.Title, post.Id);
}
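In context that check goes wherever the post is created, just before the save. A sketch assuming the standard BlogEngine.NET Post API, where title and body are whatever your own code (in my case the post by email handler) extracted:

// create the post (title and body are hypothetical locals)
Post post = new Post();
post.Title = title;
post.Content = body;

// work around the 2.8 empty-slug problem before saving
if (string.IsNullOrWhiteSpace(post.Slug))
{
    post.Slug = Post.GetUniqueSlug(post.Title, post.Id);
}

post.Save();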

In the middle of playing with this my live site died and started returning a 500 error. No amount of uploading the working local copy would fix this. Happily Server Intellect have outstanding support and restored a working backup for me in the middle of the night. Thanks chaps!

Catfood: Earth for Android 1.10

I’ve just released Catfood Earth for Android 1.10. You can control the center of the screen manually (the most requested new feature) and also tweak the transparency of each layer and the width of the terminator between day and night. It also starts a lot faster and has fewer update glitches. Grab it from Google Play if this looks like your sort of live wallpaper.

.NET 2.0 and Windows 8

Inexplicably .NET 2.0, 3.0 and 3.5 are not installed by default in Windows 8 and can’t be installed using the redistributables that worked with previous versions of Windows. You have to go digging in Windows Features to get anything older than 4.0.
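The Windows Features dialog works, but if you'd rather script it the stock DISM tool can enable the same feature (these are standard Windows commands rather than anything specific to this post; the /source variant is only needed if the machine can't reach Windows Update):

rem enable .NET 3.5 (which also covers 2.0 and 3.0) from an elevated prompt
dism /online /enable-feature /featurename:NetFx3 /all

rem offline variant: point at the sxs folder on the Windows 8 install media
dism /online /enable-feature /featurename:NetFx3 /all /source:D:\sources\sxs /limitaccess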

Reviews and Links for August 2012

The Last Policeman by Ben H. Winters

5/5

Stonking police procedural set in the months leading up to a global catastrophe.


Kill Decision by Daniel Suarez

5/5

Excellent techno-thriller. A little more serious and focused than Daemon and Freedom (TM). It's about a worst case drone scenario, ants, extra-special forces and some smart birds. Very good.


Links

Windows 8, Users 0? http://t.co/966Cuwjz

Bill Nye declares Todd Akin "fucking idiot"; issues debate challenge http://t.co/AZ3k55Y4 #fb

ITHCWY: Fight Facebook with Email: I was a little saddened to read today that Diaspora is transitioning over to… http://t.co/2G0pDdu0

Diaspora Founders To Move On, Handing Over Decentralized Social Network ‘To The Community’ http://t.co/KQGb2kpv -- sad, but not the future

RT @MargaretAtwood: Just used http://t.co/Nhna2CGO for gruesome printer problem: excellent, done in 10 mins! Tks to S H E F I N. Website ...

Check out Catfood Earth Live Wallpaper on Google Play! https://t.co/NTJQ1sYL

ITHCWY: Twenty-Four Hours with Twilio: I've wanted to play with Twilio's voice and SMS service for a while and… http://t.co/KOK0PG2M

Tuesdays http://t.co/BLNiCP3H

Twitter Cuts Off Tumblr's Ability to Find Friends http://t.co/1g3ZcClf

5 of 5 stars to The Last Policeman by Ben H. Winters http://t.co/9NF2nviH

XML: http://t.co/VadVt321 #rofl

Windows 8 Is Now Available For Developers (And For Everybody Else, There’s A 90-Day Free Trial, Too) http://t.co/iu1li6BV

Gotye's YouTube orchestra remix of "Somebody That I Used to Know" http://t.co/OUEXXltQ

ITHCWY: City by the Bay: View from Bernal Hill this afternoon. http://t.co/11cI3ctk

Nice panorama! Curiosity rover: Martian solar day 2 #360pano http://t.co/w1H2ocUm via @360cities

ITHCWY: Share a picture in MonoDroid: Here’s how to share a picture to Facebook, Twitter and so forth from… http://t.co/pByzvjmx

5 of 5 stars to Kill Decision by Daniel Suarez http://t.co/TKgUMNW0

BBC News - Mars rover makes first colour panorama http://t.co/fZ7u8smZ

How Apple and Amazon Security Flaws Led to My Epic Hacking http://t.co/BDmRAbm3

Via KQED Guides: Guide to Bay Area Tidepools: Where to Explore Amazing Marine Life | http://t.co/TDBRDnTD #todo @myEN

ITHCWY: Catfood: WebCams for Android: I’ve just released a WebCam app for Android. It’s based on WebCamSaver but… http://t.co/azZUwkkz

Pay for a new social network with no ads? https://t.co/2tox3c2y Anyone I know going to be on there? #fb

Curiosity http://t.co/D2cyWE66

BBC News - Photo shows Mars rover descent http://t.co/KfOv1qOF

Catfood WebCams for Android - Catfood Software http://t.co/VzeySq90 via @CatfoodSoftware

Check out Catfood WebCams on Google Play! https://t.co/VTU8YiBd

ITHCWY: Not a Private Key: When jarsigner says "Key Associated with [alias] not a private key" it almost certainly… http://t.co/3sk89ENV

ITHCWY: Sending email via GMail in C#/.NET using SmtpClient: I’ve stubbed my toe on this a couple of times, so here… http://t.co/QJ7YjcjI

Help end patent litigation insanity and tell your congress person to back SHIELD. http://t.co/27anadBt

ITHCWY: Support SHIELD–a small measure of patent sanity: A friend pointed me at the SHIELD (PDF) act today. This… http://t.co/ArXHgZ0e

ITHCWY: Thank you for choosing HSA Bank!: No, thank you HSA Bank for not giving me a choice and then cheekily… http://t.co/9torSXCq

Bill would force patent trolls to pay defendants’ legal bills | Ars Technica http://t.co/poB3zzlX +1, via @sr00t

What a happy coincidence. As well as #IPAday it's also goof off at work day: http://t.co/7AItItWq

Apparently it's #IPADay - luckily there's some @21stAmendment in the fridge. http://t.co/jyYPSePC