
Author: Lucas

Resident of Akron, Ohio, Seasoned Technologist and Software Engineer, President of Holeshot Software, Homebrewer, BMW Enthusiast, Sigma Tau Gamma Alumnus from Miami University

Super Rig Part Deux – Enter Aries

It was time.  The computer from my last super rig in 2009 needed an upgrade.  That thing has literally been powered on for 9 years, and, as a matter of fact, it is still on (just not my primary dev machine anymore)…  But it could no longer handle the rigors of my natively compiled UWP development and the media work I have been doing (well, at least not to the level of my patience).  Since I had such good luck with it over the years, I decided to build my next rig this year, and it's a doozy.  I went all out on this one.  This post will highlight the specs.

Intel – Core i9-7980XE 2.6GHz 18-Core Processor (https://siliconlottery.com/collections/all/products/7980xe43g)

I got it from Silicon Lottery (https://siliconlottery.com/).  I was planning on overclocking it (2.6 GHz isn't all that great for single-threaded performance), and for a few hundred extra bucks they de-lid it and guarantee it to a certain clock speed if you follow their specs.  I got the one binned for up to 4.3 GHz.  I am thoroughly convinced that my last rig lasted so long because I took great care in keeping everything cool, temperature-wise, so de-lidding the processor was interesting to me, but not something I wanted to try myself on a two-thousand-dollar chip.  So I left it up to them, and I think it was worth it.  A lot of my specs after choosing this chip come from Silicon Lottery's QVL.

Asus – ROG RAMPAGE VI EXTREME EATX LGA2066 Motherboard (https://www.asus.com/us/Motherboards/ROG-RAMPAGE-VI-EXTREME/)

This MOBO is beast mode.  You don't have a lot of choices on the X299 platform in the first place, and the Silicon Lottery QVL narrows the list to this one plus 3 other possibilities.  This is the most feature-robust one on that list in my opinion, but it's pricey…

G.Skill – Trident Z RGB 128GB (8 x 16GB) DDR4-2400 Memory (https://www.newegg.com/Product/Product.aspx?Item=N82E16820232553)

So why did I name it ARIES?  Well, it is my birth sign, for one.  But more importantly, the mascot for Aries is…..the RAM!!!  And that's not a typo: 128 GB of RAM.  I'm not actually one for all the LED lighting and stuff, but since I got a clear case (more on that below), I figured I would get a blue motif going and have the memory match, hence the RGB version.  If you don't want the RGB, the Ripjaws are identical with a plain heat sink.  I'm not sure I would ever buy any other brand of memory (fingers crossed).  I have had G.Skill in all of my builds and it has been literally flawless.

AMD – Radeon Vega Frontier Edition Liquid 16GB Video Card (https://pro.radeon.com/en/product/radeon-vega-frontier-edition/)

This build isn't a gaming PC, so I didn't get a gaming video card.  It is for work.  However, this card is unique in that it can run professional, non-consumer workstation drivers for development, then switch on the fly to gaming drivers to test what you developed, so it's kind of a hybrid.  It also has 16GB of VRAM, which is HBM2 instead of GDDR, and that matters for driving my 6 monitors.  Everyone has an opinion on video cards, and I'm not trying to start a debate; this one makes me pretty happy currently.  It has some pretty sweet-looking blue LEDs on a gold frame as well.  It's also liquid cooled, with a 120mm radiator.  It supports up to 4 monitors natively, so I'm using a Club 3D MST hub to get to 6.

(Build photos; the monitor wall is 6 Acer 1920×1080 displays, and the bottom row are touch screens)

Samsung – 960 EVO 1TB M.2-2280 Solid State Drive x 3 (https://www.samsung.com/us/computing/memory-storage/solid-state-drives/ssd-960-evo-m-2-1tb-mz-v6e1t0bw/)

A friend once told me that the feeling I got moving from HDD to SSD is tenfold going from SSD to M.2.  I'm not sure it's quite that much, but these are fast, and they aren't even the Pros.  My MOBO holds 3 of them, so I got 3 TB worth.  All of the hard drives in my last PC were Samsung, and not one of them failed in 9 years, so I stuck with them.

EVGA – SuperNOVA T2 1600W 80+ Titanium Certified Fully-Modular ATX Power Supply (https://www.evga.com/products/product.aspx?pn=220-T2-1600-X1)

Another reason I think my last rig lasted so long is that I gave it a lot of headroom.  No single component was being taxed to its limit.  That Vega Frontier card can be an energy hog, drawing upwards of 350 watts, and so can an overclocked 18-core/36-thread chip.  So I went with the biggest power supply I could find.  It's fully modular, which I really like.  Since I didn't run any SATA drives or anything, I am only using the video card power cables and the 12V rail, and being modular meant I didn't have as many cables to hide.

Thermaltake – Floe Riing RGB 360 TT Premium Edition 42.3 CFM Liquid CPU Cooler (http://www.thermaltake.com/products-model.aspx?id=C_00003122)

Going along with the "keep everything overly cool" theme, and with the Silicon Lottery QVL calling for a 360mm radiator, this one was my choice.  It was easy to install, looks really cool, and the RGBs even give me a visual indicator of the CPU temperature: when it's cool the lights are blue, and as it heats up they go from green to yellow to red (DANGER)!  My CPU idles at about 28-29 degrees Celsius, and considering the 7980XE is supposed to run hot, I'm thrilled with that.  Under prolonged Prime95 stress testing it doesn't get hotter than 69 degrees, and mining on all 18 cores with MinerGate it hovers around 55-60 degrees.  I used Arctic MX-4 thermal paste per the QVL.

Cooler Master – COSMOS C700P ATX Full Tower Case (http://www.coolermaster.com/case/full-tower/cosmos-c700p/)

With the size of my components, I wanted a BIG case, and believe me when I say it: this thing is HUGE.  All that room helps with cooling, a lot of the parts are huge and would be cumbersome to fit into something smaller, and I have plenty of desk space to support a large case.  The reviews on the case are, shall we say, mixed… but I have found it to be an extraordinary case.  It looks really sharp, has a tempered glass door to show off your LEDs, sports the iconic COSMOS handles (although my final build weighs over 75 pounds, so I'm not taking it anywhere), and so far it seems really well built.

Cougar – Dual-X 73.2 CFM 140mm Fan (https://www.amazon.com/Cougar-CFD14HBG-140mm-Cooling-Green/dp/B00C42TJ54)

As a personal rule, I always replace the stock fans with aftermarket ones that push more CFM, and I try to fill every fan slot I can.  My old PC was LOUD from all the fans; it constantly sounded like a helicopter taking off in my office.  This one has more fans, and it is whisper quiet.  Like, it's almost weird how quiet it is.  The fans make practically zero noise, so you can occasionally hear the burble of the water block.  These are blue LED (per my motif) with hydraulic bearings, and I am quite pleased with them so far.  I have my 360mm radiator blowing in, one of these on the bottom of the case pulling cold air in from the floor, one as a rear exhaust fan, 2 on top as exhaust fans, and the liquid-cooled video card's radiator blowing in.  So, by some crude calculations (more intake than exhaust), I have slightly positive pressure in my case.

Overclocking

Currently I am overclocked about 54%, running all 18 cores at 4.0 GHz (up from the stock 2.6 GHz).  The QVL and warranty allow me to go up to 4.3, but I wanted a little headroom.  And it is quite stable.  Lockups are few and far between, and I am currently blaming the relatively new video card drivers; they appear to be the culprit anyway.

My old Novabench score versus the new one (the new run was at the 4.3 GHz overclock my CPU is rated for; taking it to 4.0 barely affected the score):

[Novabench screenshots: old rig vs. Aries]

The improvement is INSANE.  Considering my old PC was still a pretty good computer, Aries blows it away, and I can tell from daily use.  Visual Studio loads in the blink of an eye, and compiling our native UWP application went from 30 minutes to about 4.

Total cost of the build was ~$8,500 (not counting the monitors or the audio gear noted below).  Full part list here: https://pcpartpicker.com/user/lkrammes/saved/#view=vPVm8d

I currently work for a collaboration company that does a lot in the media space (http://collaborationsolutions.com/products/shared-media-player/), and we are scattered all over the world, so we spend a lot of time video chatting on Microsoft Teams.  I wanted the freedom to talk without a headset, so I have a fairly complex audio/visual setup:

Creative Sound Blaster X7 (http://www.soundblaster.com/x7/) and sound setup

This is a full-featured DAC, not just a sound card.  You literally could run your home theater setup with it.  I have it wired to 2 Bose studio monitors and a 10-inch Polk Audio subwoofer (https://www.polkaudio.com/products/psw10).  The sound quality is simply amazing.  My audio engineer friend told me it was the best-sounding computer he has ever heard while we were tuning the EQ.  To give me the freedom to walk around my office and talk, I have an Audio-Technica PRO 44 cardioid condenser boundary microphone mounted above my top row of monitors (http://www.audio-technica.com/cms/wired_mics/8ba9f72f1fc02bc5/index.html).  It requires phantom power, and the X7 didn't have enough juice to push it, so I had to add a phantom power supply (https://www.pyleaudio.com/sku/PS430/Compact-1-Channel-48V-Phantom-Power-Supply).

For the video, I went with the Logitech BRIO (https://www.logitech.com/en-us/product/brio), and man, this thing is sharp.  The video quality is amazing, and I get lots of comments on how clear I look in video conferences.

Das Keyboard 4 Professional (https://www.daskeyboard.com/daskeyboard-4-professional/) and Level 10 M Diamond Black Mouse (http://www.ttesports.com/Mice/39/LEVEL_10_M_Diamond_Black/productPage.htm?a=a&g=ftr#.WlV3w6inGUk)

I am a fan of the phrase "if you use something every day, buy the best one you can afford."  Just like a mechanic wants Snap-on or Mac Tools, I wanted a good keyboard and mouse, and these are simply my favorites.  The keyboard has Cherry MX mechanical switches that I find a joy to type on, and having a volume control knob at my fingertips is one of my favorite features.  The mouse is a joint design effort between Thermaltake and BMW.  Those who know me know my obsession with BMW… (love this mouse).

ROG Rapture GT-AC5300 Router (https://www.asus.com/us/Networking/ROG-Rapture-GT-AC5300/)

This thing is a beast as well.  It's super fast and has a ton of range (I almost don't need my access points in the house, of which I had 3 to get coverage everywhere).  I currently have it plugged in to 2 ASUS AC68Us (https://www.asus.com/us/Networking/RTAC68U/) as APs, one on each floor with the router in the basement, to get full coverage all over my house and yard.  Recently, ASUS announced that they are going to let the AC68U be used as an AiMesh node in conjunction with the Rapture (https://www.asus.com/AiMesh/), which I will be experimenting with once the firmware gets out of beta.  My neighbors tell me that my WiFi signal is stronger in their house than their own router's…

So, that’s my new system.  So far I couldn’t be happier.  What do you think?

[Photos: pre and post cable cleanup, and the finished rig ready to rock]

Using Azure Media Services to get metadata from a media file

Azure Media Services is a great tool for encoding all types of media.  One of its major advantages is that it accepts a bunch of different input types (AMS supported file types), so you can almost agnostically give it a video file and get an mp4 as output (amongst a myriad of other things it can do).  However, with the current version, you can't get any information about a video file until AFTER it has been transcoded.

I wanted to get information about the input video file BEFORE it is transcoded….

Sure, there are packages that can do this (MediaInfo, though you can't get audio channel info; FFProbe; TagLibSharp; etc.), but most, if not all, require the file to be written to disk.  That is a problem if you are looking at a byte array from blob storage, or want to get that information from a stream uploaded from a web client without writing it to disk.

So I applied a little hack to use AMS itself to get the audio and video metadata from a video file.  I needed it quickly, so I didn't want to encode the entire video.

First, you need a JSON preset that performs the simplest (read: fastest) of AMS tasks: generating a single thumbnail.

{
   "Version":1.0,
   "Codecs":[
      {
         "Type":"PngImage",
         "PngLayers":[
            {
               "Type":"PngLayer",
               "Width":640,
               "Height":360
            }
         ],
         "Start":"{Best}"
      }
   ],
   "Outputs":[
      {
         "FileName":"{Basename}_{Index}{Extension}",
         "Format":{
            "Type":"PngFormat"
         }
      }
   ]
}

This will create a single thumbnail, and it asks AMS to generate the "best" one, meaning it uses its brain to figure out the most relevant frame, so you don't get a blank thumbnail just because the first few seconds of the video are black.

IJob metaDataJob = CreateJob(context, $"{inputAsset.Asset.Name}-Metadatajob");
IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard", inputAsset.GetMediaContext());

ITask task = metaDataJob.Tasks.AddNew($"{inputAsset.Asset.Name}-MetadataTask",
        processor,
        EncodingPreset.GetBaseThumbnailOnlyEncodingPreset(), // this is the JSON from above
        TaskOptions.None);
task.InputAssets.Add(inputAsset);

task.OutputAssets.AddNew($"Thumbnail_{inputAsset.Name}", AssetCreationOptions.None);

await metaDataJob.SubmitAsync();

// Check job execution and wait for the job to finish.
Task progressJobTask = metaDataJob.GetExecutionProgressTask(CancellationToken.None);
progressJobTask.Wait();

// Get a refreshed job reference after waiting on a thread.
metaDataJob = GetJob(metaDataJob.Id, context);

// Check for errors
if (metaDataJob.State == JobState.Error)
{
    return;
}
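
A quick note: GetBaseThumbnailOnlyEncodingPreset() is just a helper that returns the JSON preset above as a string.  Something minimal like this would do (reading from a "ThumbnailOnlyPreset.json" file is my assumption here; an embedded resource or a hard-coded string works just as well):

public static class EncodingPreset
{
    // Returns the thumbnail-only MES preset shown above as raw JSON.
    public static string GetBaseThumbnailOnlyEncodingPreset()
    {
        return System.IO.File.ReadAllText("ThumbnailOnlyPreset.json");
    }
}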

I have found that regardless of the video size or length, this process takes an average of 40 seconds (if you have a media reserved unit available, and you should if you utilize this technique: see Auto Scaling Media Reserved Units in AMS).  And, so that not all is lost, you will likely have a use for this thumbnail if you are doing this sort of thing in the first place.

Once the job is complete, you can see that the output for the job is a single thumbnail and a _metadata.xml file.

Now we must assign a SAS locator to it so we can download that xml file:

IAsset outputAsset = metaDataJob.OutputMediaAssets[0];
ILocator outputLocator = outputAsset.Locators.Where(l => l.Type == LocatorType.Sas).FirstOrDefault() ??
                         outputAsset.GetOrCreateLocator(LocatorType.Sas, AccessPermissions.Read | AccessPermissions.List | AccessPermissions.Write, AssetManager.CalculateExpirationDate(message));
IAssetFile outputAssetXmlFile = outputAsset.AssetFiles.Where(file => file.Name.EndsWith(".xml")).First(); // gets {guid}_metadata.xml
Uri xmlUri = outputAssetXmlFile.GetSasUri(outputLocator);

By the way, GetOrCreateLocator is an extension method that helps reuse locators, since AMS limits you to only 5:

public static ILocator GetOrCreateLocator(this IAsset asset, LocatorType locatorType, AccessPermissions permissions, TimeSpan duration, DateTime? startTime = null, TimeSpan? expirationThreshold = null)
{
  MediaContextBase context = asset.GetMediaContext();

  ILocator assetLocator = context.Locators.Where(l => l.AssetId == asset.Id && l.Type == locatorType).OrderByDescending(l => l.ExpirationDateTime).ToList().Where(l => (l.AccessPolicy.Permissions & permissions) == permissions).FirstOrDefault();

  if (assetLocator == null)
  {
    // If there is no locator in the asset matching the type and permissions, then a new locator is created.
    assetLocator = context.Locators.Create(locatorType, asset, permissions, duration, startTime);
  }
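  // DefaultExpirationTimeThreshold is a static TimeSpan default (defined elsewhere in my helper class).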
  else if (assetLocator.ExpirationDateTime <= DateTime.UtcNow.Add(expirationThreshold ?? DefaultExpirationTimeThreshold))
  {
    // If there is a locator in the asset matching the type and permissions but it is expired (or near expiration), then the locator is updated.
    assetLocator.Update(startTime, DateTime.UtcNow.Add(duration));
  }

  return assetLocator;
}

Ok, so now we have an xml file containing the metadata, and it looks like this:

<?xml version="1.0"?>
<AssetFiles xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/windowsazure/mediaservices/2014/07/mediaencoder/inputmetadata">
  <AssetFile Name="Taylor_Swift_-_Blank_Space_mp4_ItemId(1040).mp4" Size="62935612" Duration="PT4M32.463S" NumberOfStreams="2" FormatNames="mov,mp4,m4a,3gp,3g2,mj2" FormatVerboseName="QuickTime / MOV" StartTime="PT0S" OverallBitRate="1847">
    <VideoTracks>
      <VideoTrack Id="1" Codec="h264" CodecLongName="H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10" TimeBase="1/90000" NumberOfFrames="6531" StartTime="PT0S" Duration="PT4M32.355S" FourCC="avc1" Profile="High" Level="4.0" PixelFormat="yuv420p" Width="1920" Height="1080" DisplayAspectRatioNumerator="16" DisplayAspectRatioDenominator="9" SampleAspectRatioNumerator="1" SampleAspectRatioDenominator="1" FrameRate="23.980" Bitrate="1716" HasBFrames="1">
        <Disposition Default="1" Dub="0" Original="0" Comment="0" Lyrics="0" Karaoke="0" Forced="0" HearingImpaired="0" VisualImpaired="0" CleanEffects="0" AttachedPic="0"/>
        <Metadata key="language" value="und"/>
        <Metadata key="handler_name" value="VideoHandler"/>
      </VideoTrack>
    </VideoTracks>
    <AudioTracks>
      <AudioTrack Id="2" Codec="aac" CodecLongName="AAC (Advanced Audio Coding)" TimeBase="1/44100" NumberOfFrames="11734" StartTime="PT0S" Duration="PT4M32.463S" SampleFormat="fltp" ChannelLayout="stereo" Channels="2" SamplingRate="44100" Bitrate="125" BitsPerSample="0">
        <Disposition Default="1" Dub="0" Original="0" Comment="0" Lyrics="0" Karaoke="0" Forced="0" HearingImpaired="0" VisualImpaired="0" CleanEffects="0" AttachedPic="0"/>
        <Metadata key="language" value="und"/>
        <Metadata key="handler_name" value="SoundHandler"/>
      </AudioTrack>
    </AudioTracks>
    <Metadata key="major_brand" value="isom"/>
    <Metadata key="minor_version" value="512"/>
    <Metadata key="compatible_brands" value="isomiso2avc1mp41"/>
    <Metadata key="encoder" value="Lavf56.40.101"/>
  </AssetFile>
</AssetFiles>

That has the information I am looking for!  We can consume it now:

XDocument doc = XDocument.Load(xmlUri.ToString());

Dictionary<XName, string> videoAttributes = doc.Descendants().First(element => element.Name.LocalName == "VideoTrack").Attributes().ToDictionary(attribute => attribute.Name, attribute => attribute.Value);

int width = Convert.ToInt32(videoAttributes["Width"]);

int height = Convert.ToInt32(videoAttributes["Height"]);

int bitrate = Convert.ToInt32(videoAttributes["Bitrate"]);

Dictionary<XName, string> audioAttributes = doc.Descendants().First(element => element.Name.LocalName == "AudioTrack").Attributes().ToDictionary(attribute => attribute.Name, attribute => attribute.Value);

int channels = Convert.ToInt32(audioAttributes["Channels"]);

or pull anything else out of that file that you want.
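
One non-obvious bit if you go digging further: the Duration attributes are ISO 8601 durations (e.g. "PT4M32.463S"), which XmlConvert from System.Xml parses directly.  A small sketch, using the same doc as above:

XElement assetFile = doc.Descendants().First(element => element.Name.LocalName == "AssetFile");

// XmlConvert understands XSD/ISO 8601 duration strings like "PT4M32.463S".
TimeSpan duration = XmlConvert.ToTimeSpan(assetFile.Attribute("Duration").Value);
long sizeInBytes = Convert.ToInt64(assetFile.Attribute("Size").Value);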

In my post about creating a custom bitrate ladder (see Creating a custom bitrate ladder from AMS), we need that information BEFORE we transcode so we can set the ceiling of our bitrate ladder.

I spoke with David Bristol from Microsoft about this (his blog has a bunch of great AMS-related information: David Bristol's Media blog), and he agrees that something like the output of MediaInfo or FFmpeg's ffprobe would be great to run on an uploaded asset; he is suggesting it to the AMS team, so hopefully we will see this kind of functionality in the future.  40 seconds isn't great, so I hope we can improve on that, but this might get you to the dance for now.

Creating a custom bitrate ladder from Azure Media Services Transcoding

When submitting a transcoding job to Azure Media Services with Media Encoder Standard, the documentation tells you to use one of the provided presets, like this:

string configuration = File.ReadAllText(@"c:\supportFiles\preset.json");

// Create a task (see https://docs.microsoft.com/en-us/azure/media-services/media-services-mes-presets-overview for the provided presets)
ITask task = job.Tasks.AddNew("Media Encoder Standard encoding task", processor, configuration, TaskOptions.None);

or to use Adaptive Streaming by adding a task like this:

ITask task = job.Tasks.AddNew("My encoding task", processor, "Adaptive Streaming", TaskOptions.None);

In the first example, you are creating multi-bitrate mp4s all the way up to 1080p, or even 4K if that is the preset you selected.  In the latter example, what happens under the covers is great: you are telling AMS to create the bitrate ladder on the fly based on the input, and letting Microsoft work its magic.  But there are limitations to using Adaptive Streaming from C#; one is that you can't add thumbnails in the same job, for example.

So what if you want a little more control?  I've created a fluent interface for building your own presets and generating a bitrate ladder that doesn't "up-encode" beyond the quality of the original video.

First, we need to define an EncodingPreset class that will eventually be converted to JSON in valid MES preset format:

public class EncodingPreset
     {
         private EncodingPreset()
         {
             Codecs = new List<Codec>();
             Outputs = new List<Output>();
         }

        public double Version { get; set; }
         public List<Codec> Codecs { get; set; }
         public List<Output> Outputs { get; set; }

        public static EncodingPreset GetBaseEncodingPreset()
         {
             var preset = new EncodingPreset
                          {
                              Version = 1.0d
                          };

            preset.Codecs.Add(Codec.GetH264Codec());
             preset.Outputs.Add(Output.GetMp4Output());

            return preset;
         }

        public EncodingPreset AddNormalAudio()
         {
             Codec codec = Codecs.FirstOrDefault(c => c.Type == "AACAudio");
             if (codec == null)
             {
                 Codec audioCodec = Codec.GetNormalAudioCodec();

                Codecs.Add(audioCodec);
             }

            return this;
         }

        public EncodingPreset AddHDAudio()
         {
             Codec codec = Codecs.FirstOrDefault(c => c.Type == "AACAudio");
             if (codec == null)
             {
                 Codec audioCodec = Codec.GetHDAudioCodec();
                 Codecs.Add(audioCodec);
             }

            return this;
         }

        public EncodingPreset AddBitrateLadder(int width, int height, int bitrate)
         {
             IList<ResolutionInfo> orderedLadder = BitrateLadder.OrderedLadder; //lowest to highest resolution
             int originalPixels = width * height;
             var bitrateTolerance = .05;

            var layersToGenerate = new List<ResolutionInfo>
                                    {
                                        new ResolutionInfo // add the original
                                        {
                                            Width = width,
                                            Height = height,
                                            Bitrate = bitrate
                                        }
                                    };
             foreach (ResolutionInfo step in orderedLadder)
             {
                 if (step.Pixels <= originalPixels)
                 {
                     int min = Math.Min(step.Bitrate, bitrate);
                     layersToGenerate.Add(new ResolutionInfo
                                          {
                                              Width = step.Width,
                                              Height = step.Height,
                                              Bitrate = min
                                          });
                 }
             }

            // make the bitrates distinct - not sure i like this
             List<ResolutionInfo> orderedLayersToGenerate = layersToGenerate.OrderBy(info => info.Pixels).ThenBy(info => info.Bitrate).ToList();
             for (var i = 0; i < orderedLayersToGenerate.Count - 1; i++)
             {
                 foreach (ResolutionInfo layerToGenerate in orderedLayersToGenerate.Where(layerToGenerate => orderedLayersToGenerate.Any(info => info.Bitrate == layerToGenerate.Bitrate && info.Pixels != layerToGenerate.Pixels)))
                 {
                     layerToGenerate.Bitrate = layerToGenerate.Bitrate - 1;
                 }
             }

            foreach (ResolutionInfo layerToGenerate in orderedLayersToGenerate.Where(layerToGenerate => !HasExistingStepWithinTolerance(layerToGenerate.Width, layerToGenerate.Height, layerToGenerate.Bitrate, bitrateTolerance)))
             {
                 AddVideoLayer(layerToGenerate.Width, layerToGenerate.Height, layerToGenerate.Bitrate);
             }

            return this;
         }

        private bool HasExistingStepWithinTolerance(int width, int height, int min, double bitrateTolerance)
         {
             Codec codec = Codecs.FirstOrDefault(c => c.Type == "H264Video");
             if (codec == null)
             {
                 return false;
             }
             return codec.H264Layers.Any(layer => layer.Width == width && layer.Height == height && Math.Abs((layer.Bitrate - min) / (double) layer.Bitrate) <= bitrateTolerance);
         }

        public EncodingPreset AddVideoLayer(int width, int height, int bitrate)
         {
             H264Layer h264Layer = H264Layer.GetVideoLayer(width, height, bitrate);
             Codec codec = Codecs.FirstOrDefault(c => c.Type == "H264Video");
             if (codec == null)
             {
                 codec = Codec.GetH264Codec();
                 Codecs.Add(codec);
             }

            if (!codec.H264Layers.Any(layer => layer.Width == width && layer.Height == height && layer.Bitrate == bitrate))
             {
                 codec.H264Layers.Add(h264Layer);
             }

            return this;
         }

        public EncodingPreset AddPngThumbnails()
         {
             Codec codec = Codecs.FirstOrDefault(c => c.Type == "PngImage");
             if (codec == null)
             {
                 PngLayer pngLayer = PngLayer.Get640x360Thumbnail();

                Codec thumbnailCodec = Codec.GetPngThumbnailCodec();
                 thumbnailCodec.Start = "00:00:01";
                 thumbnailCodec.Step = "00:00:01";
                 thumbnailCodec.Range = "00:00:58";
                 thumbnailCodec.Type = "PngImage";
                 thumbnailCodec.PngLayers.Add(pngLayer);

                Codecs.Add(thumbnailCodec);

                Outputs.Add(Output.GetPngThumbnailOutput());
             }

            return this;
         }
     }

With supporting classes for the collections and other types:

public class Codec
     {
         private Codec()
         {
         }

        public string KeyFrameInterval { get; set; }
         public List<H264Layer> H264Layers { get; set; }
         public string Type { get; set; }
         public List<PngLayer> PngLayers { get; set; }
         public string Start { get; set; }
         public string Step { get; set; }
         public string Range { get; set; }
         public string Profile { get; set; }
         public int? Channels { get; set; }
         public int? SamplingRate { get; set; }
         public int? Bitrate { get; set; }
         public string Condition { get; set; }

        public static Codec GetH264Codec()
         {
             return new Codec
                    {
                        Type = "H264Video",
                        KeyFrameInterval = "00:00:02",
                        H264Layers = new List<H264Layer>()
                    };
         }

        public static Codec GetNormalAudioCodec()
         {
             return new Codec
                    {
                        Type = "AACAudio",
                        Profile = "AACLC",
                        Channels = 2,
                        SamplingRate = 48000,
                        Bitrate = 128,
                        Condition = "InsertSilenceIfNoAudio"
                    };
         }

        public static Codec GetHDAudioCodec()
         {
             return new Codec
                    {
                        Type = "AACAudio",
                        Profile = "AACLC",
                        Channels = 6,
                        SamplingRate = 48000,
                        Bitrate = 384,
                        Condition = "InsertSilenceIfNoAudio"
                    };
         }

        public static Codec GetPngThumbnailCodec()
         {
             return new Codec
                    {
                        Type = "PngImage",
                        Start = "00:00:01",
                        Step = "00:00:01",
                        Range = "00:00:58",
                        PngLayers = new List<PngLayer>()
                    };
         }
     }

public class Output
  {
      private Output()
      {
      }

     public string FileName { get; set; }
      public Format Format { get; set; }

     public static Output GetMp4Output()
      {
          return new Output
                 {
                     Format = new Format
                              {
                                  Type = "MP4Format"
                              },
                     FileName = "{Basename}_{Width}x{Height}_{VideoBitrate}{Extension}"
                 };
      }

     public static Output GetPngThumbnailOutput()
      {
          return new Output
                 {
                     Format = new Format
                              {
                                  Type = "PngFormat"
                              },
                     FileName = "{Basename}_{Index}{Extension}"
                 };
      }
  }

public class H264Layer
    {
        private H264Layer()
        {
        }

       public string Profile { get; set; }
        public string Level { get; set; }
        public int Bitrate { get; set; }
        public int MaxBitrate { get; set; }
        public string BufferWindow { get; set; }
        public int Width { get; set; }
        public int Height { get; set; }
        public int BFrames { get; set; }
        public int ReferenceFrames { get; set; }
        public bool AdaptiveBFrame { get; set; }
        public string Type { get; set; }
        public string FrameRate { get; set; }

       public static H264Layer GetVideoLayer(int width, int height, int bitrate)
        {
            return new H264Layer
                   {
                       Profile = "Auto",
                       Level = "auto",
                       Bitrate = bitrate,
                       MaxBitrate = bitrate,
                       BufferWindow = "00:00:05",
                       Width = width,
                       Height = height,
                       BFrames = 3,
                       ReferenceFrames = 3,
                       AdaptiveBFrame = true,
                       Type = "H264Layer",
                       FrameRate = "0/1"
                   };
        }
    }

public class PngLayer
    {
        private PngLayer()
        {
        }

       public string Type { get; set; }
        public int Width { get; set; }
        public int Height { get; set; }

       public static PngLayer Get640x360Thumbnail()
        {
            return new PngLayer
                   {
                       Height = 360,
                       Width = 640,
                       Type = "PngLayer"
                   };
        }
    }

public class Format
    {
        public string Type { get; set; }
    }

a class to hold our original video information to compare to our ideal ladder:

public class ResolutionInfo
    {
        public int Width { get; set; }
        public int Height { get; set; }
        public int Bitrate { get; set; }

       public long Pixels
        {
            get
            {
                return Width * Height;
            }
        }
    }

and an extension method to convert to json properly for this case:

public static class EncodingPresetExtensions
     {
         public static string ToJson(this EncodingPreset preset)
         {
             return JsonConvert.SerializeObject(preset,
                                                new JsonSerializerSettings
                                                {
                                                    NullValueHandling = NullValueHandling.Ignore
                                                });
         }
     }

and finally our ideal bitrate ladder:

public static class BitrateLadder
    {
        private static readonly IList<ResolutionInfo> Ladder = new List<ResolutionInfo>();

       static BitrateLadder()
        {
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 20000,
                           Width = 4096,
                           Height = 2304
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 18000,
                           Width = 3840,
                           Height = 2160
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 16000,
                           Width = 3840,
                           Height = 2160
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 14000,
                           Width = 3840,
                           Height = 2160
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 12000,
                           Width = 2560,
                           Height = 1440
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 10000,
                           Width = 2560,
                           Height = 1440
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 8000,
                           Width = 2560,
                           Height = 1440
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 6000,
                           Width = 1920,
                           Height = 1080
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 4700,
                           Width = 1920,
                           Height = 1080
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 3400,
                           Width = 1280,
                           Height = 720
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 1500,
                           Width = 960,
                           Height = 540
                       });
            Ladder.Add(
                       new ResolutionInfo
                       {
                           Bitrate = 1000,
                           Width = 640,
                           Height = 360
                       });
        }

       public static IList<ResolutionInfo> OrderedLadder
        {
            get
            {
                return Ladder.OrderBy(pair => pair.Pixels).ThenBy(info => info.Bitrate).ToList();
            }
        }
    }

Note that I have set some defaults in these classes for my particular use case.

So let’s talk about the AddBitrateLadder function:

It takes in the width, height, and bitrate of the original media file so as not to wastefully "up-encode" it.  It then creates a ladder, making the "top" layer the original specs, and steps down from there using our ideal bitrate ladder as a guide.  I should also note that AMS keys off the bitrate, so you cannot have 2 different resolutions with the same bitrate; that is why there is code in the method that merely subtracts 1 from duplicated bitrates to make them unique when the original video quality is too low to fit into our specified ladder.  Lastly, it includes a tolerance so that you don't create 2 layers that are virtually identical.  To make that concrete: a 1920×1080 source at 4,000 kbps yields layers at 640×360 @ 1,000, 960×540 @ 1,500, 1280×720 @ 3,400, and 1920×1080 @ 4,000 (the 4,700 and 6,000 ladder steps get clamped to the source's 4,000 and then de-duplicated), while the 1920×1080 @ 266 case in the test below shows the subtract-1 behavior.

So now I can use this to generate a custom bitrate ladder with normal audio and thumbnails, for example:

EncodingPreset.GetBaseEncodingPreset()
.AddNormalAudio()
.AddPngThumbnails()
.AddBitrateLadder(playlistItem.AVFile.Width, playlistItem.AVFile.Height, playlistItem.AVFile.Bitrate);

or HD Audio with no thumbnails:

EncodingPreset.GetBaseEncodingPreset()
.AddHDAudio()
.AddBitrateLadder(playlistItem.AVFile.Width, playlistItem.AVFile.Height, playlistItem.AVFile.Bitrate);

etc. etc.

And it’s totally testable:

[TestMethod]
       public void CalcLayers1920x1080at266()
       {
           List<H264Layer> layers = CalcLayers(1920, 1080, 266);
           Assert.AreEqual(4, layers.Count);

           H264Layer layer1 = layers[0];
           Assert.AreEqual(263, layer1.Bitrate);
           Assert.AreEqual(360, layer1.Height);
           Assert.AreEqual(640, layer1.Width);

           H264Layer layer2 = layers[1];
           Assert.AreEqual(264, layer2.Bitrate);
           Assert.AreEqual(540, layer2.Height);
           Assert.AreEqual(960, layer2.Width);

           H264Layer layer3 = layers[2];
           Assert.AreEqual(265, layer3.Bitrate);
           Assert.AreEqual(720, layer3.Height);
           Assert.AreEqual(1280, layer3.Width);

           H264Layer layer4 = layers[3];
           Assert.AreEqual(266, layer4.Bitrate);
           Assert.AreEqual(1080, layer4.Height);
           Assert.AreEqual(1920, layer4.Width);
       }

 private static List<H264Layer> CalcLayers(int width, int height, int bitrate)
       {
           EncodingPreset preset1 = EncodingPreset.GetBaseEncodingPreset()
                                                  .AddNormalAudio()
                                                  .AddPngThumbnails()
                                                  .AddBitrateLadder(width, height, bitrate);
           return preset1.Codecs.Where(codec => codec.Type == "H264Video")
                         .SelectMany(codec => codec.H264Layers)
                         .ToList();
       }

Then, when it is time to submit my job, I can:

ITask task = job.Tasks.AddNew("My encoding task", processor, myPreset.ToJson(), TaskOptions.None);

Boom! Now we have the power of Adaptive Streaming with the benefit of more control over the ideal ladder, as well as other functions of AMS.

Auto scaling Media Reserved Units in Azure Media Services

When you spin up an Azure Media Services instance in Azure, you are prompted with a choice: how many Media Reserved Units do you want, and what horsepower do you want behind them?

Well, what exactly does that mean?

Reserving a unit means that when you submit a job to Media Services, it won't sit in a public queue waiting to start.  This is important, because if the public queue is busy, it could take quite a while for your job to get picked up.  If you have all the time in the world for your job to complete, this isn't a big deal, but if you are like me, with a customer waiting on the job, speed is a priority.  You can choose from 1-10 reserved units (you can request more via a support request), and they come at a cost.  Also, when you reserve a unit, it has to be a specific speed (S1, S2, or S3).

So if you want to have 10 reserved units at all times, and you want S3 so jobs complete the fastest Azure offers, that is 80 cents an hour, and that adds up over time (roughly $576 a month if left running around the clock).  I should also note that you can NOT reserve zero S2 or S3 units; if you want to be in the public pool, it has to be S1.  Therefore, you are paying at least 4 cents an hour if you want immediate response times for your jobs by reserving one S1.  I should also note that if you made a support request to get more than 10 units, then change the speed of those reserved units, MaxReservableUnits gets reset to 10 and your support request is essentially lost.  I have spoken with Azure support about this, and while they don't call it a bug, it is something they are addressing in a future release of AMS.

So, the solution I came up with was to auto-scale our units with C#.

When a message is sent to my worker role to work with Azure Media Services, I reserve (currently reserved units + 1) S3 units, and when the job is done I decrement one S3 unit.  When I hit 0 units, I set the speed back to S1 (because, remember, you can only have zero units if you are set to S1):

internal static async Task ReserveMediaEncodingUnit(MediaContextBase context, int amount)
{
    if (ConfigurationProvider.AutoScaleMRU())
    {
        // There is always exactly one of these (https://github.com/Azure/azure-sdk-for-media-services/blob/dev/test/net/Scenario/EncodingReservedUnitDataTests.cs)
        IEncodingReservedUnit encodingReservedUnit = context.EncodingReservedUnits.FirstOrDefault();
        if (encodingReservedUnit != null)
        {
            encodingReservedUnit.CurrentReservedUnits = Math.Min(amount,
                                                                 ConfigurationProvider.MaxMRUProvisioned() == 0
                                                                     ? encodingReservedUnit.MaxReservableUnits
                                                                     : ConfigurationProvider.MaxMRUProvisioned());
            encodingReservedUnit.ReservedUnitType = ReservedUnitType.Premium; // Premium == S3
            await encodingReservedUnit.UpdateAsync();
        }
    }
}

ConfigurationProvider.MaxMRUProvisioned() is a setting I have that is equal to 10.  I did that because I initially put in the service request to get more than 10, only to find out it gets reset back to 10 if you change the speed.  If Microsoft changes this behavior, I can set my setting to 0 and use their MaxReservableUnits value without any code changes.

Deallocating units:

internal static async Task DeallocateMediaEncodingUnit(MediaContextBase context, int amount)
{
    if (ConfigurationProvider.AutoScaleMRU())
    {
        IEncodingReservedUnit encodingReservedUnit = context.EncodingReservedUnits.FirstOrDefault();
        if (encodingReservedUnit != null)
        {
            encodingReservedUnit.CurrentReservedUnits = Math.Max(0, amount);

            // Zero units is only allowed on S1 (Basic), so flip the type back when we hit zero.
            encodingReservedUnit.ReservedUnitType = encodingReservedUnit.CurrentReservedUnits == 0
                                                        ? ReservedUnitType.Basic
                                                        : ReservedUnitType.Premium;

            await encodingReservedUnit.UpdateAsync();
        }
    }
}

If I hit 0 units, I can reset:

private static async Task ResetMediaEncodingUnits(MediaContextBase context)
{
    if (ConfigurationProvider.AutoScaleMRU())
    {
        IEncodingReservedUnit encodingReservedUnit = context.EncodingReservedUnits.FirstOrDefault();
        if (encodingReservedUnit != null)
        {
            encodingReservedUnit.CurrentReservedUnits = 0;
            encodingReservedUnit.ReservedUnitType = ReservedUnitType.Basic; // back to S1
            await encodingReservedUnit.UpdateAsync();
        }
    }
}
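
To show how the pieces fit around a job, here is a minimal sketch (RunJobWithAutoScale is a hypothetical wrapper name, not code from my worker role, and error handling is elided):

internal static async Task RunJobWithAutoScale(MediaContextBase context, IJob job)
{
    int current = context.EncodingReservedUnits.First().CurrentReservedUnits;

    // Scale up by one S3 unit before submitting so the job skips the public queue.
    await ReserveMediaEncodingUnit(context, current + 1);
    try
    {
        await job.SubmitAsync();
        await job.GetExecutionProgressTask(CancellationToken.None);
    }
    finally
    {
        // Scale back down; DeallocateMediaEncodingUnit flips back to Basic (S1) at zero.
        await DeallocateMediaEncodingUnit(context, current);
    }
}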

So now, when my users aren't transcoding anything and my AMS instance is sitting idle, I incur no cost.  And when they submit a job, I allocate a unit, so the job avoids the public pool, gets submitted right away, and completes at premium speed.  I can't guarantee this hack will work forever; when speaking with MS they told me this code has prompted them to think about how reserved units work in AMS, and they may change the behavior in the future.

Happy transcoding!

Automating Publish of UWP Application to the Windows Store

You have published your app to the store, hooray!  Now you want to automate that deployment with VSTS…

If you are having issues following Microsoft's documentation to accomplish this task, here is a summary of how I got it to work.  (I should note that you must have published an app to the store manually first for this to work.)

1.  Go to your Dev Center Dashboard (https://developer.microsoft.com/en-us/dashboard/apps/overview).

2.  Click the gear icon in the top right hand corner > Manage Users

3.  “Add Azure AD Application” > “New Azure AD Application”

4.  Give it a name, like "Windows Store Connection"; the reply URL and App ID URL can be anything at this stage.  IMPORTANT: make it part of the Developer role so that it can publish to the Store.

5.  Click Save, and you will be taken to a confirmation page.  Click Manage Users to see your newly created application in the grid, listed with a GUID.  Click it to edit it.

6.  Under Keys, click Add New Key.  Make note of the Client ID, Key, and Azure Tenant ID; you will not be able to see the key again after you leave this page.  Upon confirmation, click Manage Users to go back to your list, then click on the connection again to confirm everything saved correctly.

Now we move on to the build step.

1.  Add Step Windows Store – Publish

2.  For the Service Endpoint, click New

Name your connection, something like “WindowsStoreConnection”

Windows Store API Url: https://manage.devcenter.microsoft.com

Azure Tenant Id: what you noted when creating the Azure AD Application, or find it in the portal (https://portal.azure.com) under Azure Active Directory > Properties > Directory ID

ClientID: what you noted from the creation of the Azure AD Application

Client Secret: the key you noted when creating the key for the Azure AD Application

3.  Click OK and choose your newly created service endpoint from the dropdown

4.  Application identification method: ID

5.  Application ID: get this from your Dev Center Dashboard > App Management >  App Identity > Store ID

Now, when you run this build step, it will publish your app to the store and poll the service until it is finished (so keep in mind this could tie up one of your build agents for up to 3 business days).

TL;DR

The MS docs led me to believe that I could create my Azure AD application from the portal via App Registrations.  When I did that, I got the dreaded "503 Service Unavailable" error on my publish build step.  The trick was to create the Azure AD application from the Windows Dev Center, give it Developer permissions, and tie that application back to my Windows Dev Center connection endpoint.

Creating an snk for signing your assemblies with Visual Studio

There are a number of reasons you may (or may not) want to sign your assemblies, but if you do, here is a simple way of doing it in Visual Studio.

  • Create an snk file by opening a Visual Studio command prompt as Administrator <—IMPORTANT (https://msdn.microsoft.com/en-us/library/ms229859%28v=vs.110%29.aspx)
    • sn -k <YOUR SNK FILE NAME>.snk
  • Then create a Public Key
    • sn -p <YOUR SNK FILE NAME>.snk <YOUR PUBLIC KEY FILE NAME>.PublicKey
  • Get your Public Key Token
    • sn -tp <YOUR PUBLIC KEY FILE NAME>.PublicKey
    • this will output your public key and token to the console; make note of them

  • Next, go to the properties of the project containing the assembly you want to sign, and click on the Signing tab.
  • Check Sign the assembly
  • Click the dropdown and browse to the snk file you created in step 1

The reason you want to note your public key and public key token is for use in your app.config or for InternalsVisibleTo.

For example, if the assembly you have signed needs to be specified in an InternalsVisibleTo attribute in the AssemblyInfo file, you would specify it like this:

[assembly: InternalsVisibleTo("MySignedAssemblyName, PublicKey=0024000004800000940000000602000000240000525341310004000001000100155b8d9138457a0be37b064f4f0fa70ceb948f08a7855122f1d6fe9cb89e74b68d60853358a061482d5e62423881caf1cf276d82b11a2e6075939181ab9e1c3dadfcf23082b04d15fb5f9ca20da5bc99b29f830e5c5d23ae9d3dee6f609d0980ed8ba584f348d48921055e13e66c987f5c5712e15285235cb649f0a1e65c0bb2")]

Or, if you were referencing the assembly in your app.config for a custom Logging handler using Enterprise Library, it would look like this:

<exceptionTypes>
    <add name="All Exceptions"
               type="System.Exception, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
               postHandlingAction="NotifyRethrow">
        <exceptionHandlers>
            <add
                type="MyCustomLogExceptionHandlerClassName, MySignedCustomLogHandlerAssemblyName, Culture=neutral, PublicKeyToken=<font style="background-color: #ffff00">0b4def2ce7bdd21a</font>"
                name="LogExceptionHandler" />
        </exceptionHandlers>
    </add>
</exceptionTypes>

Strongly Typing TempData in your MVC Application with Extension Methods

As I've stated before, and as those who know me can attest, when working with C# I try to use the compiler as often as I can and keep things strongly typed.  When I started working in MVC, I didn't like the fact that TempData was defined like this:

public class TempDataDictionary : IDictionary<string, object>, ICollection<KeyValuePair<string, object>>, IEnumerable<KeyValuePair<string, object>>, IEnumerable

TempData and ViewData are potentially valuable things, but <string, object>?  Really?

Here is how I get around that and use the compiler to my advantage with some simple extension methods:

public static class TempDataExtensions
{
    public static T Get<T>(this TempDataDictionary tempData, string key)
    {
        if (tempData[key] is T)
        {
            var tempDataItem = (T)tempData[key];
            return tempDataItem;
        }
        throw new InvalidCastException(string.Format("Temp Data does not contain type {0} for key {1}", typeof(T), key));
    }
 
    public static void Set<T>(this TempDataDictionary tempData, string key, T value)
    {
        tempData[key] = value;
    }
}
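
If throwing on a miss feels too harsh, a TryGet variant in the same spirit is easy to add (a sketch; TempDataDictionary's indexer returns null for a missing key, so the "is T" check covers that case too):

public static bool TryGet<T>(this TempDataDictionary tempData, string key, out T value)
{
    if (tempData[key] is T)
    {
        value = (T)tempData[key];
        return true;
    }

    value = default(T);
    return false;
}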

So, in your controller, you can Set to TempData and Get from TempData like this:

public ActionResult Index()
{
    TempData.Set("SomeObjectKey", new SomeObject());
    TempData.Set("SomeBoolKey", true);
    TempData.Set("SomeStringKey", "test");
     
    TempData.Get<SomeObject>("SomeObjectKey"); // returns SomeObject
    TempData.Get<bool>("SomeBoolKey"); // returns a boolean true
    TempData.Get<string>("SomeStringKey"); // returns the string "test"
 
    return View();
}

You can also do the same with ViewData:

public static class ViewDataExtensions
{
    public static T Get<T>(this ViewDataDictionary viewData, string key)
    {
        if (viewData[key] is T)
        {
            var viewDataItem = (T)viewData[key];
            return viewDataItem;
        }
        throw new InvalidCastException(string.Format("View Data does not contain type {0} for key {1}", typeof(T), key));
    }
 
    public static void Set<T>(this ViewDataDictionary viewData, string key, T value)
    {
        viewData[key] = value;
    }
}

I know what you are thinking: this doesn't stop you from setting TempData the <string, object> way, and you are correct.  To get and set using strong types, you have to have the discipline to use these extension methods.  But with these tools, you can give yourself a fighting chance.

Don’t litter your code with stringly typed settings, mkay?

When using C#, I am kinda a strongly-typed bigot and like to use the compiler as much as I can.  Since practically every application I have ever worked on has had some sort of settings access from a config file, I felt there had to be a better way.

So, given this config file:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings>
    <add key="StringSetting" value="filepath"/>
    <add key="BoolSetting" value="true"/>
    <add key="StringListDelimitedSetting" value="one;two;three"/>
  </appSettings>
</configuration>

I don’t want to litter my code with this everywhere:

//BAD
string value = System.Configuration.ConfigurationManager.AppSettings["StringSetting"];
if (value == "SOMETHING")
{
    //do something
}
  
//WORSE?
string boolValue = System.Configuration.ConfigurationManager.AppSettings["BoolSetting"];
if (boolValue == "YES")
{
    //do something
}
  
//PRODUCES STRONG TYPE BUT EVEN MORE CODE
string someOtherBoolValue = System.Configuration.ConfigurationManager.AppSettings["SomeOtherBoolSetting"];
bool strongBoolValue;
if (Boolean.TryParse(someOtherBoolValue, out strongBoolValue))
{
    if (strongBoolValue)
    {
        //do something
    }
}

So, this is what I do to keep my "stringly" typed settings in one place, strongly typed, and easily accessible in my code:

public static class AppSettingsExtensions
{
    public static string StringSetting(this NameValueCollection settings)
    {
        string setting = settings["StringSetting"];
        if (!string.IsNullOrWhiteSpace(setting)) // covers the null check too
        {
            return setting;
        }

        return string.Empty;
    }

    public static bool BoolSetting(this NameValueCollection settings)
    {
        string setting = settings["BoolSetting"];
        bool test;
        if (!string.IsNullOrWhiteSpace(setting) && Boolean.TryParse(setting, out test))
        {
            return test;
        }

        return false;
    }

    public static IEnumerable<string> StringListDelimitedSetting(this NameValueCollection settings)
    {
        string setting = settings["StringListDelimitedSetting"];
        if (!string.IsNullOrWhiteSpace(setting))
        {
            return setting.Split(';', ',').ToList();
        }

        return Enumerable.Empty<string>();
    }
}

Accessing settings in code now is simple and gives you a strong type:

//GOOD
string stringSetting = ConfigurationManager.AppSettings.StringSetting();
if (stringSetting == "SOMETHING")
{
    //do something
}
 
//OR
bool boolSetting = ConfigurationManager.AppSettings.BoolSetting();
if (boolSetting)
{
    //do something
}
 
//OR
IEnumerable<string> listSettings = ConfigurationManager.AppSettings.StringListDelimitedSetting();
foreach (string setting in listSettings)
{
    //do something
}

And yes, this works for connection strings as well; just change the type the extension targets:

using System.Configuration;
 
public static class ConnectionStringExtensions
{
    public static string SomeConnectionString(this ConnectionStringSettingsCollection settings)
    {
        //the indexer returns null if the name isn't present, so check before dereferencing
        ConnectionStringSettings setting = settings["SomeConnectionString"];
        if (setting != null && !string.IsNullOrWhiteSpace(setting.ConnectionString))
        {
            return setting.ConnectionString;
        }
 
        return string.Empty;
    }
}

Accessed like:

string connectionString = ConfigurationManager.ConnectionStrings.SomeConnectionString();

And there you have it: that is a tool I like to keep in my toolbox when working with configuration files.
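If you want to go one step further, the same idea generalizes.  Here is a rough sketch (my own, not from the toolbox above) of a generic variant that leans on a TypeConverter; the TypedSetting name is made up:

using System;
using System.Collections.Specialized;
using System.ComponentModel;
 
public static class TypedSettingExtensions
{
    //hypothetical generic helper: convert any setting to T, falling back on failure
    public static T TypedSetting<T>(this NameValueCollection settings, string key, T fallback)
    {
        string setting = settings[key];
        if (string.IsNullOrWhiteSpace(setting))
        {
            return fallback;
        }
 
        try
        {
            TypeConverter converter = TypeDescriptor.GetConverter(typeof(T));
            return (T)converter.ConvertFromInvariantString(setting);
        }
        catch (Exception)
        {
            //an unparseable value falls back to the supplied default
            return fallback;
        }
    }
}

Usage would be bool boolSetting = ConfigurationManager.AppSettings.TypedSetting("BoolSetting", false);.  The trade-off is that the string key leaks back into calling code, so the named extensions above are still my preference for keeping each key in exactly one place.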


How I missed Codemash 2014 and still learned something

I have to stop having kids around Codemash time, or rather, roughly 9 months before Codemash time.

Every year I attend, I usually write a follow-up blog post about how it rejuvenates my love for the profession.  I have missed 2 incarnations of Codemash in its history due to my 2 beautiful daughters being born within months of the event (read: WORTH IT), but this year was a little different.  This time I wasn’t able to attend, but I still had that same fire ignited inside me as in the years that I was there.

What happened?  Well, two things.

1.  I was able to send a young and talented developer in my place who had never been.

2.  The Twitter feed.

Let’s start with #1.  I found that “prepping” someone who was as excited to go as I was my first time made me feel like I was there again.  Make sure you do this; listen to this speaker, he/she is great; participate in an open space; eat lots of bacon (he was paleo, so he liked this part); MEET PEOPLE; etc. etc.  In the week leading up to the event, we talked about it every day.

Secondly, I have probably never used Twitter as much as I did that week, trying to follow along with the events.  Since I was home with the baby and up a lot, I was spending a lot of time at night working on the website for my new business.  I was challenging myself to step out of my .NET comfort zone and build something solely with HTML and JavaScript (maybe it was Codemash from a distance inspiring me).  The problem I was having was that my first cut at it had a lot of HTML that I was repeating across pages (navigation bar, header, footer, images, etc.), and I thought there has to be a better way.  My first thought was MVC to deliver a base layout and render a body dynamically, but that broke my rule of trying something new, so that was out.  In comes the Codemash Twitter feed, where I was hearing a lot about AngularJS, specifically from the friend who went in my place, who had attended a class on it and was tweeting about it.  The lights went on.  That would handle my situation AND be something new (but still kinda comfortable, since I was used to the controller concept of MVC!).  A couple of nights of reading and doing, and voila!  Holeshot Software was finally out there.  Is it spectacular?  Not really, but it was something I had been wanting to do, it challenged me, I succeeded, and my business has a landing page with my contact information.

So, all of the reasons that I love Codemash so much were still present, even in my absence, including interaction with the most important part, the people (only this time it was strictly via Twitter).

Unless something happens around April again, I will see you all next year!


Opinions are like Utility.dll, everybody’s got one…

You know what I’m talking about.  You want to put that method that sends an email, or that Key/Value Pair dictionary thing that you stole from Jon Skeet’s blog, somewhere for all of your colleagues to use and bask in your reusable API glory.  The question is, where do you put it?  On a server share?  Checked in to source control?  On a mapped drive that everyone has?  I suppose, but how are you going to version it?  How are you going to let the people referencing it upgrade?  How are you going to handle breaking changes?  Dependency management?  Well, here is my opinion: NuGet to the rescue.  If you are sharing any code in your organization, hosting an internal NuGet feed is a great way to do that.  Get this set up early, have your CI build continue to publish packages, and you can add and share code as quickly as your build server can build it (and run the tests, of course).

I have used the integrated NuGet server in TeamCity before, and I have to admit, it’s pretty awesome (minus managing packages; as of this writing that still sucks, since you can’t issue NuGet.exe commands to it).  For the purposes of this post, however, for those that aren’t running TeamCity or want a process that isn’t married to a third party, we will talk about setting one up from scratch.

Before we get into putting the package somewhere, let’s go over creating the package in the first place.  So, you have your assembly:

[screenshot]

The first thing you need to do is create a nuspec file.  You can generate one by running “nuget spec” in the same folder as your csproj (I just put a copy of NuGet.exe in the same folder to make it easy):

[screenshots]

All that does is generate a nuspec file for you, but if you ask me, it’s pretty weak in terms of actually doing anything specific to this project.  Maybe it’s more useful if, as part of your build process, you are generating a new nuspec file every time and treating this text as wildcards for what you would replace.  Honestly, for now, you can just copy this and start from there:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>$id$</id>
    <version>$version$</version>
    <title>$title$</title>
    <authors>$author$</authors>
    <owners>$author$</owners>
    <licenseUrl>http://LICENSE_URL_HERE_OR_DELETE_THIS_LINE</licenseUrl>
    <projectUrl>http://PROJECT_URL_HERE_OR_DELETE_THIS_LINE</projectUrl>
    <iconUrl>http://ICON_URL_HERE_OR_DELETE_THIS_LINE</iconUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>$description$</description>
    <releaseNotes>Summary of changes made in this release of the package.</releaseNotes>
    <copyright>Copyright 2013</copyright>
    <tags>Tag1 Tag2</tags>
  </metadata>
</package>

Those $variable$ replacement tokens come from your AssemblyInfo.cs, so make sure that it is populated, or just hard code the values instead (we will pass the version into the pack command):

[screenshot]
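For reference, the mapping from tokens to attributes looks something like this in AssemblyInfo.cs (the values here are illustrative, not from the screenshot):

using System.Reflection;
 
//$id$ comes from the assembly name, $title$ from AssemblyTitle,
//$author$ from AssemblyCompany, $description$ from AssemblyDescription,
//and $version$ from AssemblyVersion
[assembly: AssemblyTitle("Holeshot.Utility")]
[assembly: AssemblyDescription("Shared utility code for Holeshot projects")]
[assembly: AssemblyCompany("Holeshot Software")]
[assembly: AssemblyVersion("1.0.0.0")]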

Also, IT IS IMPORTANT THAT THE NUSPEC FILE HAS THE SAME NAME AS THE CSPROJ.  That will come into play when we package this sucker.  It is a convention thing, and it bit me pretty good when I tried to do this for the first time.

Time to pack it up.  There are MSBuild targets for this as well, but since we have that NuGet.exe right there, we can call “nuget pack” (if it isn’t sitting next to your csproj, you can pass the csproj path after the word ‘pack’).  In this case we are going to use a few of the optional command line arguments to: 1. build the project before packaging, 2. produce debug symbols, 3. build in Release mode, and 4. specify a version.  You can specify an OutputDirectory, but it will use the current directory if you don’t.

So, “nuget pack -Build -Symbols -Properties Configuration=Release -Version 1.0.0.0”

[screenshot]

You will have some warnings that you haven’t filled everything out properly, and you can take care of that if you’d like.

Voila!  We have a package! (and symbols, which is great for an internal NuGet feed, since you most likely own the code and will want to debug it)

[screenshot]

Now, where to put it?

Let’s start with setting up an internal server. There are 2 ways to do this:

1. a network share (simple, but may have some performance and security complications)

2. a remote feed through an IIS website (probably best, but takes a little more effort to set up)

#1 – A Network Share

Create a share somewhere and put your packages in it.

In Visual Studio – Go to Tools > Options > Package Manager > Package Sources.  Add a name and a UNC share location:

[screenshot]

Now you can consume it in the project that needs it.  First things first, make sure you have this checked:

[screenshot]

Then you can right click on your project, and Manage Nuget Packages:

[screenshots]

When you install, you will get a reference to Holeshot.Utility, a packages.config, and a .nuget folder (if you have Restore NuGet Packages on, which I think you will).  Open NuGet.targets and make sure RestorePackages is true and DownloadNuGetExe is true (see the snippet below).  Make sure that NuGet.exe is NOT checked into source control; it will download a new copy whenever it needs to.

[screenshots]
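If the screenshots are hard to make out, the two properties live in a PropertyGroup inside NuGet.targets and look roughly like this (the Condition attributes are already in the file; you are just flipping the values to true):

<!-- inside NuGet.targets -->
<RestorePackages Condition=" '$(RestorePackages)' == '' ">true</RestorePackages>
<DownloadNuGetExe Condition=" '$(DownloadNuGetExe)' == '' ">true</DownloadNuGetExe>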

There you have it: the next time a new Holeshot.Utility is put in the network share, your Holeshot.ProjectThatNeedsUtility will notify you that there has been an update to your reference, and you have the option to take the new version.  Herein lies one of the biggest advantages of this process.  It puts the product owners back in charge.  Don’t want to take the upgrade now because the version bump indicates a possible breaking change?  Then don’t take it.  When you are ready, take it, correct any compilation errors (if any), run your tests, and you are in business with a potentially effortless upgrade.

[screenshots]

#2 – A remote feed in IIS

If you have a build server, TFS server, or some other machine that everyone who needs the feed can reach, that will do the trick.  It’s not like it needs to be a beefy machine.

First, create an Empty ASP.NET Web Application, and install NuGet.Server from Manage NuGet Packages.  Notice how it resolves all of its own dependencies (Elmah, Ninject, about 25 others, etc.).  You can have that too in your own packages by specifying dependencies in your nuspec file; that is another post, but there is a quick taste after the screenshot below.

[screenshot]
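As that quick taste, dependencies are declared inside the metadata element of the nuspec like this (the ids and versions below are made up for illustration):

<dependencies>
  <!-- consumers of this package will pull these in automatically -->
  <dependency id="Newtonsoft.Json" version="6.0.1" />
  <dependency id="Holeshot.Utility" version="1.0.0.0" />
</dependencies>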

Now you will have this.  Notice the Packages folder; that is where your packages need to go now.

[screenshot]

Go to the web.config and specify an API key:

[screenshot]
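For reference, the setting in question is a plain appSetting; the value below is obviously a placeholder for your own key:

<appSettings>
  <!-- clients must supply this key to push or delete packages -->
  <add key="apiKey" value="YOUR-SECRET-KEY-HERE" />
</appSettings>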

Publish and browse to the web site:

[screenshot]

Now we can either push to the feed, or just copy the packages to the folder specified above:

[screenshot]
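If you go the push route, the command looks something like “nuget push Holeshot.Utility.1.0.0.0.nupkg -Source http://yourserver/ -ApiKey YOUR-SECRET-KEY-HERE” (the server URL is wherever you published the site, and the key is whatever you put in the web.config).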

BAM!

[screenshot]

There you have it.  I did all of this, while documenting it, in a matter of 2 hours.  Well worth the effort if you ask me.
