Ayende @ Rahien


Geo Location & Spatial Searches with RavenDB–Part III-Importing

The following are samples from the data sources that MaxMind provides:

[images: sample rows from the GeoLiteCity-Blocks.csv and GeoLiteCity-Location.csv files]

The question is, how do we load them into RavenDB?

Just to give you some numbers, there are 1.87 million blocks and over 350,000 locations.

Those are big numbers, but still small enough that we can work with the entire thing in memory. I wrote some quick & ugly parsing routines for them:

public static IEnumerable<Tuple<int, IpRange>> ReadBlocks(string dir)
{
    using (var file = File.OpenRead(Path.Combine(dir, "GeoLiteCity-Blocks.csv")))
    using (var reader = new StreamReader(file))
    {
        reader.ReadLine(); // skip the copyright notice
        reader.ReadLine(); // skip the header row

        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // naive CSV parsing (does not handle embedded commas) - quick & ugly, as promised
            var entries = line.Split(',').Select(x => x.Trim('"')).ToArray();
            yield return Tuple.Create(
                int.Parse(entries[2]),
                new IpRange
                {
                    Start = long.Parse(entries[0]),
                    End = long.Parse(entries[1]),
                });
        }
    }
}

public static IEnumerable<Tuple<int, Location>> ReadLocations(string dir)
{
    using (var file = File.OpenRead(Path.Combine(dir, "GeoLiteCity-Location.csv")))
    using (var reader = new StreamReader(file))
    {
        reader.ReadLine(); // skip the copyright notice
        reader.ReadLine(); // skip the header row

        string line;
        while ((line = reader.ReadLine()) != null)
        {
            var entries = line.Split(',').Select(x => x.Trim('"')).ToArray();
            yield return Tuple.Create(
                int.Parse(entries[0]),
                new Location
                {
                    Country = NullIfEmpty(entries[1]),
                    Region = NullIfEmpty(entries[2]),
                    City = NullIfEmpty(entries[3]),
                    PostalCode = NullIfEmpty(entries[4]),
                    Latitude = double.Parse(entries[5]),
                    Longitude = double.Parse(entries[6]),
                    MetroCode = NullIfEmpty(entries[7]),
                    AreaCode = NullIfEmpty(entries[8])
                });
        }
    }
}

private static string NullIfEmpty(string s)
{
    return string.IsNullOrWhiteSpace(s) ? null : s;
}
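
A side note on the Start/End columns: MaxMind encodes each IPv4 address as a single number, a.b.c.d => a * 256^3 + b * 256^2 + c * 256 + d, which is why they parse as plain longs above. A small helper along these lines (not part of the original post, shown only for illustration) converts a dotted address into that form for later lookups:

public static long IpToLong(string ip)
{
    var parts = ip.Split('.');
    return long.Parse(parts[0]) * 16777216 + // 256^3
           long.Parse(parts[1]) * 65536 +    // 256^2
           long.Parse(parts[2]) * 256 +
           long.Parse(parts[3]);
}

// e.g. IpToLong("192.168.1.1") == 3232235777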

And then it was a matter of bringing it all together:

var blocks = from blockTuple in ReadBlocks(dir)
             group blockTuple by blockTuple.Item1
             into g
             select new
             {
                 LocId = g.Key,
                 Ranges = g.Select(x => x.Item2).ToArray()
             };

var results =
    from locTuple in ReadLocations(dir)
    join block in blocks on locTuple.Item1 equals block.LocId into joined
    from joinedBlock in joined.DefaultIfEmpty()
    let _ = locTuple.Item2.Ranges = (joinedBlock == null ? new IpRange[0] : joinedBlock.Ranges)
    select locTuple.Item2;


The advantage of doing things this way is that we only have to write to RavenDB once, because we merged the results in memory. That is why I said that those are big numbers, but still small enough that we can process them entirely in memory.

Finally, we wrote them to RavenDB in batches of 1024 items.
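
The post does not show the write code itself, but a sketch of what it could look like, assuming the standard RavenDB .NET client API (DocumentStore, OpenSession, Store, SaveChanges), would be:

private static void WriteAll(IEnumerable<Location> results)
{
    using (var store = new DocumentStore { Url = "http://localhost:8080" }.Initialize())
    {
        var batch = new List<Location>(1024);
        foreach (var location in results)
        {
            batch.Add(location);
            if (batch.Count == 1024)
            {
                WriteBatch(store, batch);
                batch.Clear();
            }
        }
        if (batch.Count > 0)
            WriteBatch(store, batch); // the final partial batch
    }
}

private static void WriteBatch(IDocumentStore store, IEnumerable<Location> batch)
{
    // A fresh session per batch keeps the number of tracked entities
    // (and therefore memory use) bounded.
    using (var session = store.OpenSession())
    {
        foreach (var location in batch)
            session.Store(location);
        session.SaveChanges(); // one round trip per batch of 1024 documents
    }
}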

The entire process took about 3 minutes and wrote 353,224 documents to RavenDB, which include all of the 1.87 million IP blocks in a format that is easy to search through.
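
Given the Location class above, each stored document presumably looks something along these lines (the field values here are purely illustrative):

{
    "Country": "US",
    "Region": "NY",
    "City": "New York",
    "PostalCode": "10001",
    "Latitude": 40.75,
    "Longitude": -73.99,
    "MetroCode": "501",
    "AreaCode": "212",
    "Ranges": [
        { "Start": 16777216, "End": 16777471 }
    ]
}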

In our next post, we will discuss actually doing searches on this information.


Posted By: Ayende Rahien


Comments

Richard, 06/21/2012 11:43 AM

Hi,

what hardware setup did you use for this? I am trying to do this on my machine, an i7-870 quad core with 8GB of memory, and it took at least 30 minutes to write 100k locations. I have the server running on the same machine as the client app. I would be interested to see the actual code for writing to the server. For now I skip/take 1024 items and store each item, finishing with a SaveChanges. For each batch I use a new session, otherwise I get an out-of-memory exception...

Cheers,

Richard, 06/21/2012 12:38 PM

It actually took me 01:25:10 to import all of the locations; I must be doing something wrong...

Comments have been closed on this topic.