Strangest support calls for RavenDB
We have a support hotline for RavenDB, and usually we get people who give us “good” problems to solve. And then we have the people who… don’t.
The following are a few of the strangest issues from the past month or so.
- OutOfMemoryException is thrown when running RavenDB in 32-bit mode and documents of 50–70MB in size are used.
Solution – When running in 32-bit mode, RavenDB only has 2GB of virtual memory to work with, and that is really not enough to do much. There is no reason today for any server app to run in 32-bit mode. Also, a 70MB document?! Seriously!
- Very slow startup time for RavenDB when the number of indexes approaches 20,000.
Solution – That isn’t a typo, honest. We had a customer that sent us a database that had 19,273 indexes in it. When they restarted the database, we had to load all of those indexes, and that took a… while. And no, those weren’t dynamic indexes; they were static indexes that (I hope, at least) were probably generated by a tool.
- Index creation can take a long time when the number of map indexes in a multi map index is higher than 150.
Solution – Are you trying to be funny?! What is it that you are doing?
- Index creation can take a long time when the size of the index definition is greater than 16KB.
Solution – That is a single index definition that goes on for roughly 3,000 lines. You are doing things wrong.
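The 32-bit limit behind the first item is easy to put numbers on: a 32-bit process can address at most 4GB of virtual memory, and user mode typically gets only about 2GB of that, so a handful of 70MB documents plus parsing and indexing buffers chews through it quickly. Here is a minimal Python sketch of that arithmetic; the helper names and the 4x expansion factor are my own illustrative assumptions, not anything from RavenDB:

```python
import struct

def process_bits() -> int:
    """Pointer width of the current process: 32 on a 32-bit runtime, 64 on a 64-bit one."""
    return struct.calcsize("P") * 8

def large_document_risk(doc_size_mb: int, address_space_mb: int = 2048) -> bool:
    """Rough rule of thumb: loading, parsing, and indexing a document can easily
    need several times its raw size in memory, so flag any document whose
    working set would take a noticeable slice of a 2GB address space."""
    working_set_mb = doc_size_mb * 4  # assumed expansion factor, illustrative only
    return working_set_mb > address_space_mb * 0.10

print(process_bits())           # 64 on most modern machines
print(large_document_risk(70))  # a 70MB document is risky in 32-bit mode
print(large_document_risk(1))   # a 1MB document is fine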
What is the worst thing that you have seen?
Comments
In one case we have approx. 30 maps in a map reduce index, which covers 150,000 documents. Indexing does take a very long time (hours to rebuild).
32-bit? Are you serious? Why do you still support it?
A database supporting 32-bit is like a modern website supporting IE6 :)
This is a database! Take the entire RAM and just use as much as possible.
Blog posts where you make fun of the mistakes customers make using your product. That's the worst. ;)
Talk and work on solutions that prevent your customers from making these mistakes. Don't make them look like they are fools.
OutOfMemoryException - why is it possible to run with this configuration in the first place? Don't start when the requirements are not met.
20k indexes - actively warn the user when the number of indexes exceeds a certain amount and explain the effects it could cause, or don't allow the creation of that many indexes (safe by default?) unless specific configuration parameters are overridden.
Maps in a multi map index - same, prevent bad usage by limiting it.
16KB index definition - same, prevent the use of such large definitions unless overridden.
@Micha, I don't see any problem in this post. There are no names, and all the people are anonymous.
Furthermore, you can't prevent every bad practice.
@Uri, since there's RavenDB.Embedded, which can run on a smaller footprint than your standard server, supporting 32-bit makes a bit more sense, I think.
April Fools is a great time to do this. :-)
One of my faves with RavenDB was the guy in the Google Group who was having a perf problem with transformers. He posted his transformer:
https://10137960102884337581.googlegroups.com/attach/231ec7741ec10bb4/Transformer.txt?part=0.1&view=1&vt=ANaJVrEwtbaWlpBnf_BVX06p9TpQqOEd8AzUb0_ap-X_V2MeYM0E9aexC2S7aclfe_rrDvfIHOMy2um5D0mgxW6tp0f8Gi-N4CzY3NBe3W-M58XBsjSif5M
I'm sure you've considered it, but keeping with the "safe by default" mantra, maybe Raven should have some built-in ways to prevent abuse like this.
Documents shouldn't be larger than some MB.
No more than N indexes per database.
At most N maps per multimap index.
...Block these, or alternately, issue a warning in the Studio when these conditions are detected.
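The "safe by default" checks proposed in the comments above could plausibly take the form of soft limits that warn rather than fail. A hypothetical sketch follows; the limit names and values are mine for illustration and are not actual RavenDB configuration options:

```python
# Hypothetical soft limits, in the spirit of the commenters' suggestions.
WARN_LIMITS = {
    "document_size_mb": 16,        # documents larger than this get flagged
    "indexes_per_database": 100,
    "maps_per_multimap_index": 50,
}

def configuration_warnings(stats: dict) -> list:
    """Compare observed statistics against the soft limits and return warning
    strings instead of raising, so unusual-but-legitimate use stays possible."""
    warnings = []
    for key, limit in WARN_LIMITS.items():
        value = stats.get(key, 0)
        if value > limit:
            warnings.append(f"{key}={value} exceeds the recommended limit of {limit}")
    return warnings

print(configuration_warnings({"document_size_mb": 70, "indexes_per_database": 19273}))
```

Returning warnings instead of blocking sidesteps the objection, raised later in the thread, that hard limits upset users who have legitimate reasons for unusual configurations.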
Fixed link for out-of-control transformer: http://bit.ly/1CPEK4t
Murphy's law of software development: your users will always find a way to do whatever you don't want them to do, and they will do it.
Even if there are no customers' names here, they probably read your blog and now know that you poke fun at them. A pretty unprofessional post.
@Judah - WOW!!!! That's just crazy. Reading through it a little I can see salon, fence, green house, etc... Was he trying to model some simulation game?
I don't know what a BORETUM is though...
@Ted
No, at some point we all need to learn. Especially the "you're doing it wrong" lessons need to be learned.
This post has no naming and shaming in it, and there will always be customers that do wrong things, and they need to know that.
You can either treat your customers like princesses and let them live in their ignorance, or be as open as possible with them even if they might not like it. I prefer the second option (even as a customer).
@Catalin
Or we could treat them with some respect. The "hey, look at all this stupid" tone, which is exactly the tone of this post, is pretty poor form.
Micha, Can you point to "a customer" that was named here? This is a post that is meant to point out uncommon mistakes and make people aware of them. And the intention was to follow the April 1st spirit.
As for allowing those errors: we support 32-bit, and we run great with normal document sizes. With 70MB documents, that is a bit hard, but we can't just disallow it. In particular, placing limits, even very high ones, leads to users being really upset with us. See the complaints about paging limits in RavenDB.
Users sometimes have good reasons to do strange things, and it isn't nice to put limits there. You might actually have a good reason for a very big index, and if you are willing to accept the price, why not?
Judah, We really considered this, yes. But it is a problem, for several reasons. Adding limitations after the fact is a breaking change, so that requires a major version upgrade. And users that run into those limitations are usually quite upset; see what happened with paging.
Rather than limits, perhaps warnings in the studio dashboard with explanations.
"This is an unusual number of indexes. Most systems would have between...."
I'm with Kijana: inform and provide insight. This is similar to how ReSharper notices potential problems and has "why is ReSharper suggesting this" with a link to their documentation.
Even as a very experienced developer, I repeatedly find myself looking at ReSharper's hinting about "implicitly captured closure". I repeatedly interrogate it to see how important it really is to understand and think through the implications where it occurs, rather than just assume.