Ayende @ Rahien

Hi!
My name is Ayende Rahien
Founder of Hibernating Rhinos LTD and RavenDB.

Geo Location & Spatial Searches with RavenDB–Part III-Importing


The following are samples from the data sources that MaxMind provides:

[Two images in the original post: sample rows from GeoLiteCity-Blocks.csv and GeoLiteCity-Location.csv]

The question is, how do we load them into RavenDB?

Just to give you some numbers, there are 1.87 million blocks and over 350,000 locations.

Those are big numbers, but still small enough that we can work with the entire thing in memory. I wrote some quick & ugly parsing routines for them:

public static IEnumerable<Tuple<int, IpRange>> ReadBlocks(string dir)
{
    using (var file = File.OpenRead(Path.Combine(dir, "GeoLiteCity-Blocks.csv")))
    using (var reader = new StreamReader(file))
    {
        reader.ReadLine(); // skip the copyright line
        reader.ReadLine(); // skip the header line

        string line;
        while ((line = reader.ReadLine()) != null)
        {
            var entries = line.Split(',').Select(x => x.Trim('"')).ToArray();
            yield return Tuple.Create(
                int.Parse(entries[2]),
                new IpRange
                {
                    Start = long.Parse(entries[0]),
                    End = long.Parse(entries[1]),
                });
        }
    }
}

public static IEnumerable<Tuple<int, Location>> ReadLocations(string dir)
{
    using (var file = File.OpenRead(Path.Combine(dir, "GeoLiteCity-Location.csv")))
    using (var reader = new StreamReader(file))
    {
        reader.ReadLine(); // skip the copyright line
        reader.ReadLine(); // skip the header line

        string line;
        while ((line = reader.ReadLine()) != null)
        {
            var entries = line.Split(',').Select(x => x.Trim('"')).ToArray();
            yield return Tuple.Create(
                int.Parse(entries[0]),
                new Location
                {
                    Country = NullIfEmpty(entries[1]),
                    Region = NullIfEmpty(entries[2]),
                    City = NullIfEmpty(entries[3]),
                    PostalCode = NullIfEmpty(entries[4]),
                    Latitude = double.Parse(entries[5], CultureInfo.InvariantCulture),
                    Longitude = double.Parse(entries[6], CultureInfo.InvariantCulture),
                    MetroCode = NullIfEmpty(entries[7]),
                    AreaCode = NullIfEmpty(entries[8])
                });
        }
    }
}

private static string NullIfEmpty(string s)
{
    return string.IsNullOrWhiteSpace(s) ? null : s;
}
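The IpRange and Location classes aren't shown in the post; based on the fields the parsing code assigns (including the Ranges array set during the join below), they might look something like this sketch — the actual definitions may well differ:

```csharp
// Hypothetical model classes, inferred from the parsing code above.
public class IpRange
{
    public long Start { get; set; }
    public long End { get; set; }
}

public class Location
{
    public string Country { get; set; }
    public string Region { get; set; }
    public string City { get; set; }
    public string PostalCode { get; set; }
    public double Latitude { get; set; }
    public double Longitude { get; set; }
    public string MetroCode { get; set; }
    public string AreaCode { get; set; }

    // All IP ranges that resolve to this location; attached after the join.
    public IpRange[] Ranges { get; set; }
}
```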

And then it was a matter of bringing it all together:

var blocks = from blockTuple in ReadBlocks(dir)
             group blockTuple by blockTuple.Item1
             into g
             select new
             {
                 LocId = g.Key,
                 Ranges = g.Select(x => x.Item2).ToArray()
             };

var results =
    from locTuple in ReadLocations(dir)
    join block in blocks on locTuple.Item1 equals block.LocId into joined
    from joinedBlock in joined.DefaultIfEmpty()
    // (ab)uses let for its side effect: attach the matching ranges to the location
    let _ = locTuple.Item2.Ranges = (joinedBlock == null ? new IpRange[0] : joinedBlock.Ranges)
    select locTuple.Item2;


The advantage of doing things this way is that we only have to write to RavenDB once, because we merged the results in memory. That is why I said that those are big numbers, but still small enough for us to be able to process them easily in memory.

Finally, we wrote them to RavenDB in batches of 1024 items.
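The write loop itself isn't shown in the post. A minimal sketch of what batched writes with the RavenDB client might look like follows — the batching helper, store URL, and one-session-per-batch structure are my assumptions, not the actual import code:

```csharp
// Split a sequence into fixed-size batches (hypothetical helper).
static IEnumerable<List<T>> InBatchesOf<T>(IEnumerable<T> source, int size)
{
    var batch = new List<T>(size);
    foreach (var item in source)
    {
        batch.Add(item);
        if (batch.Count == size)
        {
            yield return batch;
            batch = new List<T>(size);
        }
    }
    if (batch.Count > 0)
        yield return batch; // the final, partial batch
}

// One session per batch of 1024 keeps the session's change tracker small:
foreach (var batch in InBatchesOf(results, 1024))
{
    using (var session = store.OpenSession())
    {
        foreach (var location in batch)
            session.Store(location);
        session.SaveChanges(); // one round trip per batch
    }
}
```

Opening a fresh session per batch matters: a RavenDB session tracks every stored entity, so pushing hundreds of thousands of documents through a single session would exhaust memory.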

The entire process took about 3 minutes and wrote 353,224 documents to RavenDB, which include all of the 1.87 million IP blocks in a format that is easy to search through.

In our next post, we will discuss actually doing searches on this information.


Comments

Richard

Hi,

what hardware setup did you use for this? I am trying to do this on my machine, which is an i7-870 quad core with 8GB mem. It took at least 30 mins to write 100k locations. I have the server running on the same machine as the client app. I would be interested to see the actual code for writing to the server. For now I use skip/take 1024 items and store each item, finishing with a SaveChanges. For each batch I use a new session, else I get an out-of-memory exception...

Cheers,

Richard

It actually took me 01:25:10 to import all the locations; I must be doing something wrong...

Comments have been closed on this topic.
