Ayende @ Rahien

My name is Oren Eini
Founder of Hibernating Rhinos LTD and RavenDB.


RavenDB, Victory

time to read 2 min | 299 words

Jeremy Miller’s post, “Would I use RavenDB again?”, has been making the rounds. It is a good post, and I was asked by multiple people to comment on it.

I wanted to comment very briefly on some of the issues that were brought up:

  • Memory consumption – this is probably mostly related to long-lived session usage; we expect sessions to be much shorter lived.
    • The 2nd level cache is mostly there to speed things up when you have relatively small documents. If you have very large documents, or routinely have requests that return many documents, the cache can be a memory hog. That said, the 2nd level cache is limited to 2,048 items by default, so this shouldn’t really be a big issue. And you can change that limit (or even turn the cache off) with ease.
  • Don’t abstract RavenDB too much – yeah, that has pretty much been our recommendation for a while.
    • I don’t see this as a problem. You have just the same issue if you are using any OR/M against an RDBMS.
  • Bulk Insert – the issue has already been fixed. In fact, IIRC, it was fixed within a day or two of the issue being brought up.
  • Eventual Consistency – Yes, you need to decide how to handle that. As Jeremy said, there are several ways of handling it, from using natural keys (which have no query latency associated with them) to calling WaitForNonStaleResultsAsOfNow().
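To make the last bullet concrete, here is a rough sketch of both approaches using the RavenDB client API of that era (the entity type and ids are made up for illustration):

```csharp
using (var session = store.OpenSession())
{
    // Loading by a known document id goes straight to the document store,
    // bypassing the indexes entirely - so there is no staleness to worry about.
    var order = session.Load<Order>("orders/1");

    // Queries go through indexes, which are updated in the background.
    // WaitForNonStaleResultsAsOfNow() asks the server to wait until the
    // index has caught up with all writes made before "now".
    var recent = session.Query<Order>()
        .Customize(x => x.WaitForNonStaleResultsAsOfNow())
        .Where(o => o.Company == "companies/1")
        .ToList();
}
```

The usual advice is to reach for the id-based load whenever you can, and reserve the explicit wait for the (hopefully rare) cases where a query must reflect a write the same user just made.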

Truthfully, the thing that really caught my eye wasn’t Jeremy’s post, but one of the comments:


Thank you, we spent a lot of time on that!


Phillip Haydon

Yup, the documentation is great, the community is great! Can't complain. We get a low volume of users in the JabbR Chatroom for #RavenDB but it's still fun to answer people's questions when they stop by!

Pure Krome

Hear, hear!

(and don't forget to hangout in JabbR's #RavenDb room, like @PhillipHaydon said :)

Chris Marisic

I do have to say, judging by the current forums, it seems that some users are still facing significant difficulty in using bulk insert successfully.

Ayende Rahien

Chris, we found the problem, we think. It is related to the synchronization context used in the code. More specifically, the ASP.NET synchronization context seems to be causing a major issue. We identified the issue and have a solution now.
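The comment doesn't spell out the details, but the classic shape of an ASP.NET synchronization-context problem, and the usual library-side remedy, looks roughly like this (illustrative C# only, not necessarily the actual fix that shipped):

```csharp
// Classic ASP.NET deadlock: the request thread blocks on a Task whose
// continuation needs to resume on that same request's synchronization context.
public void Handler()
{
    DoWorkAsync().Wait(); // blocks the context the continuation needs
}

async Task DoWorkAsync()
{
    await Task.Delay(100); // tries to resume on the captured (blocked) context
}

// Library-side remedy: don't capture the caller's context at all,
// so continuations run on the thread pool instead.
async Task DoWorkSafelyAsync()
{
    await Task.Delay(100).ConfigureAwait(false);
}
```

Internal async pumping, as bulk insert does, is exactly the kind of code that hits this when invoked from an ASP.NET request.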

Comments have been closed on this topic.

