Geo Location & Spatial Searches with RavenDB–Part III-Importing


The following are samples from the data sources that MaxMind provides for us:

[Sample rows from the GeoLiteCity-Blocks.csv and GeoLiteCity-Location.csv files]

The question is, how do we load them into RavenDB?

Just to give you some numbers, there are 1.87 million blocks and over 350,000 locations.

Those are big numbers, but still small enough that we can work with the entire thing in memory. I wrote some quick & ugly parsing routines for them:

public static IEnumerable<Tuple<int, IpRange>> ReadBlocks(string dir)
{
    using (var file = File.OpenRead(Path.Combine(dir, "GeoLiteCity-Blocks.csv")))
    using (var reader = new StreamReader(file))
    {
        reader.ReadLine(); // copyright notice
        reader.ReadLine(); // header

        string line;
        while ((line = reader.ReadLine()) != null)
        {
            var entries = line.Split(',').Select(x => x.Trim('"')).ToArray();
            yield return Tuple.Create(
                int.Parse(entries[2]),
                new IpRange
                {
                    Start = long.Parse(entries[0]),
                    End = long.Parse(entries[1]),
                });
        }
    }
}

public static IEnumerable<Tuple<int, Location>> ReadLocations(string dir)
{
    using (var file = File.OpenRead(Path.Combine(dir, "GeoLiteCity-Location.csv")))
    using (var reader = new StreamReader(file))
    {
        reader.ReadLine(); // copyright notice
        reader.ReadLine(); // header

        string line;
        while ((line = reader.ReadLine()) != null)
        {
            var entries = line.Split(',').Select(x => x.Trim('"')).ToArray();
            yield return Tuple.Create(
                int.Parse(entries[0]),
                new Location
                {
                    Country = NullIfEmpty(entries[1]),
                    Region = NullIfEmpty(entries[2]),
                    City = NullIfEmpty(entries[3]),
                    PostalCode = NullIfEmpty(entries[4]),
                    Latitude = double.Parse(entries[5]),
                    Longitude = double.Parse(entries[6]),
                    MetroCode = NullIfEmpty(entries[7]),
                    AreaCode = NullIfEmpty(entries[8])
                });
        }
    }
}

private static string NullIfEmpty(string s)
{
    return string.IsNullOrWhiteSpace(s) ? null : s;
}
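
For reference, these routines assume two simple document classes. The post doesn't show them, so the following is just a minimal sketch inferred from the fields being parsed; the actual classes may differ:

// Hypothetical document classes, inferred from the parsing code above.
public class IpRange
{
    public long Start { get; set; }
    public long End { get; set; }
}

public class Location
{
    public string Country { get; set; }
    public string Region { get; set; }
    public string City { get; set; }
    public string PostalCode { get; set; }
    public double Latitude { get; set; }
    public double Longitude { get; set; }
    public string MetroCode { get; set; }
    public string AreaCode { get; set; }

    // filled in later, when the ip blocks are joined to the locations
    public IpRange[] Ranges { get; set; }
}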

And then it was a matter of bringing it all together:

// group all of the ip ranges by their location id
var blocks = from blockTuple in ReadBlocks(dir)
             group blockTuple by blockTuple.Item1
             into g
             select new
             {
                 LocId = g.Key,
                 Ranges = g.Select(x => x.Item2).ToArray()
             };

// left outer join the locations to their ranges; the let clause is (ab)used
// for its side effect of attaching the ranges to each location
var results =
    from locTuple in ReadLocations(dir)
    join block in blocks on locTuple.Item1 equals block.LocId into joined
    from joinedBlock in joined.DefaultIfEmpty()
    let _ = locTuple.Item2.Ranges = (joinedBlock == null ? new IpRange[0] : joinedBlock.Ranges)
    select locTuple.Item2;


The advantage of doing things this way is that we only have to write to RavenDB once, because we merged the results in memory. That is why I said that those are big numbers, but still small enough for us to process the entire thing easily in memory.

Finally, we wrote them to RavenDB in batches of 1024 items.
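
The batching code itself is nothing special. Roughly, it looks something like the sketch below; the document store setup, the batch size handling, and the variable names are illustrative rather than the actual import code:

// Illustrative sketch only - the real import code is not shown in this post.
var store = new DocumentStore { Url = "http://localhost:8080" }.Initialize();

const int batchSize = 1024;
var batch = new List<Location>(batchSize);

Action flushBatch = () =>
{
    // a fresh session per batch, so the session never tracks too many entities
    using (var session = store.OpenSession())
    {
        foreach (var location in batch)
            session.Store(location);
        session.SaveChanges();
    }
    batch.Clear();
};

foreach (var location in results)
{
    batch.Add(location);
    if (batch.Count == batchSize)
        flushBatch();
}
if (batch.Count > 0) // the last, partial batch
    flushBatch();

Opening a new session per batch matters here: a single session tracking hundreds of thousands of entities would slow things down, while small batches keep both memory and each SaveChanges call cheap.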

The entire process took about 3 minutes and wrote 353,224 documents to RavenDB, covering all of the 1.87 million IP blocks in a format that is easy to search through.

In our next post, we will discuss actually doing searches on this information.