
time to read 2 min | 290 words

This post from Sam Gentile has made me realize that I need to clarify a few things about the recent TFS vs. XYZ discussion.

This post isn't really aimed at him [that would be me] but I do find a post by him seeming to suggest that you only can use OSS tools to be "Agile" to be, well, quite disappointing.

I guess that I didn't notice the conversation digress, but I should clarify that it has never been my intention to suggest, or to seem to suggest, any such thing. Tools help, and I think that they are important. I have my own preferences, based on my experience and the way I like to work.

I am mostly talking about OSS tools because I am familiar with them, which makes it easy to point at concrete examples. The best project management system that I have worked with is JIRA, but Trac does quite a bit of the work, and doesn't require as complex a setup.

There is no correlation or dependency between the tools that you use and the way that you work. And you most assuredly can use TFS to be agile.

I do think that tools can be of significant help; you can certainly be agile without them, but it is easier with them. My issues with TFS have nothing to do with agility, and everything to do with seamless usage, which is a whole separate issue altogether.

 

time to read 26 min | 5054 words

One of the more difficult regions to test in an application is the data access layer. It is difficult to test for several reasons:

  • It is usually complicated - fetching data effectively is not trivial in many cases.
  • It can be highly dependent on the platform you are using, and moving between platforms can be a PITA.
  • It is usually hard to mock effectively.
  • Databases by their nature keep state, while tests should be isolated.
  • It is slow - we are talking out of process calls at best, remote system calls at worst.

I am a big fan of NHibernate, and I consider myself fairly proficient in mocking, yet I find it very hard to mock data access code effectively, and that is when NHibernate already provides a very easy set of interfaces to work with.

The issue is not with calls like this:

Post post = (Post)session.Load(typeof(Post),1);

This is very easy to mock.
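For contrast, here is a rough sketch of mocking that call with Rhino Mocks (the same mocking framework used later in this post); the Post instance returned here is made up for the example:

// A sketch only - mocks ISession directly and stubs the Load call
MockRepository mocks = new MockRepository();
ISession session = mocks.CreateMock<ISession>();
Post expected = new Post();
Expect.Call(session.Load(typeof(Post), 1)).Return(expected);
mocks.ReplayAll();

Post post = (Post)session.Load(typeof(Post), 1);
Assert.AreSame(expected, post);
mocks.VerifyAll();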

The issue is with calls like this one:

Post post = (Post)session.CreateCriteria(typeof(Post))
     .Add(Expression.Like("Title", title))
     .AddOrder(Order.Asc("PublishedAt"))
     .SetFirstResult(0)
     .SetMaxResults(1)
     .UniqueResult();

Trying to mock that is going to be... painful. And this is a relatively simple query. What is worse is a series of queries that work together to return a common result. When my setup code crossed 500 lines of highly recursive mocking just to give the test a reasonable state to work with, I knew that I had an issue.

I could break it up into a more rigid interface, but that completely ignores the point of being flexible. The above query is hard coded, but pretty often I find myself building those dynamically, which is not possible using rigid (but more easily mockable) interfaces. Please note that I am not talking about the feasibility of mocking those; I have done it, it is possible, if lengthy. I am talking about maintainability and the ability to read what the intention was after six months have passed. Bummer, isn't it?

Then I thought about SQLite. SQLite, despite its documentation's shortcomings, is a lightweight database engine that supports an in-memory database. What is more, NHibernate already supports it natively (which saved me the effort :-) ). SQLite is an in-process database, and in-memory databases are wiped when their connections are closed. So far we have removed two major obstacles: the statefulness of the database, and the inherent slowdown of going across process/machine boundaries. In fact, since we are using an entirely in-memory database, we don't even touch the file system :-).

But we have the issue of moving between platforms. We can't just port the database to SQLite just for testing. Or can we?

First, let us define what we are talking about. I am not going to do performance testing (except maybe for SELECT N+1 issues) on SQLite; that requires the production (or staging) database, with a set of tools to analyze and optimize what we are doing.

So, if we rule perf testing out of the set of scenarios we are looking at, we don't need large amounts of data. NHibernate will create a schema for us, free of charge, and it can handle several databases transparently. We don't need to mock anything; it looks like we are golden.

I got the following code:

[TestFixture]
public class InMemoryTests : NHibernateInMemoryTestFixtureBase
{
    private ISession session;

    [TestFixtureSetUp]
    public void OneTimeTestInitialize()
    {
        OneTimeInitialize(typeof(SMS).Assembly);
    }

    [SetUp]
    public void TestInitialize()
    {
        session = this.CreateSession();
    }

    [TearDown]
    public void TestCleanup()
    {
        session.Dispose();
    }

    [Test]
    public void CanSaveAndLoadSMS()
    {
        SMS sms = new SMS();
        sms.Message = "R U There?";
        session.Save(sms);
        session.Flush();

        session.Evict(sms); // remove from session cache

        SMS loaded = session.Load<SMS>(sms.Id);
        Assert.AreEqual(sms.Message, loaded.Message);
    }
}

Initialize the framework, create a session, and run. Notice that the test doesn't care what it is working against. It just tests that we can save/load an entity. Let us look at the base class:

public class NHibernateInMemoryTestFixtureBase
{
    protected static ISessionFactory sessionFactory;
    protected static Configuration configuration;

    /// <summary>
    /// Initializes NHibernate and builds a session factory.
    /// Note, this is a costly call, so it will be executed only once.
    /// </summary>
    public static void OneTimeInitialize(params Assembly[] assemblies)
    {
        if (sessionFactory != null)
            return;
        Hashtable properties = new Hashtable();
        properties.Add("hibernate.connection.driver_class", "NHibernate.Driver.SQLite20Driver");
        properties.Add("hibernate.dialect", "NHibernate.Dialect.SQLiteDialect");
        properties.Add("hibernate.connection.provider", "NHibernate.Connection.DriverConnectionProvider");
        properties.Add("hibernate.connection.connection_string", "Data Source=:memory:;Version=3;New=True;");

        configuration = new Configuration();
        configuration.Properties = properties;
        foreach (Assembly assembly in assemblies)
        {
            configuration = configuration.AddAssembly(assembly);
        }
        sessionFactory = configuration.BuildSessionFactory();
    }

    public ISession CreateSession()
    {
        ISession openSession = sessionFactory.OpenSession();
        IDbConnection connection = openSession.Connection;
        // Export the schema to the in-memory database that this session's connection just created
        new SchemaExport(configuration).Execute(false, true, false, true, connection, null);
        return openSession;
    }
}

Here we just initialize NHibernate with an in memory connection string and a SQLite provider. Then, when we need to grab a session, we make sure to initialize the database with our schema. Disposing the session closes the connection, which frees the database.

So far we have handled the following issues: slow, stateful, hard to mock, and platform dependent. We have seen that none of them applies to the issue at hand. Now, what about the last one, testing complicated data fetching strategies?

Well, that is what we are doing here, isn't it? In this case, true, we aren't doing any queries, but it is the principle that matters. Looking at databases through NHibernate-tinted glasses, they look pretty much the same. And a querying strategy that works on one should certainly work on another (with some obvious exceptions). I am much more concerned about getting the correct data than about how I get it.

The beauty here is that we don't need to do anything special to make this happen. Just let the tools do their work. To use Active Record with this approach, you need to replace the calls to the Configuration class with calls to ActiveRecordStarter, and that is about it (see the sketch below).
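A minimal sketch of the Active Record variant, assuming Castle ActiveRecord's InPlaceConfigurationSource (the property set mirrors the one in the base class above, and SMS stands in for whatever [ActiveRecord] types your tests need):

// A sketch only; assumes Castle ActiveRecord
Hashtable properties = new Hashtable();
properties.Add("hibernate.connection.driver_class", "NHibernate.Driver.SQLite20Driver");
properties.Add("hibernate.dialect", "NHibernate.Dialect.SQLiteDialect");
properties.Add("hibernate.connection.provider", "NHibernate.Connection.DriverConnectionProvider");
properties.Add("hibernate.connection.connection_string", "Data Source=:memory:;Version=3;New=True;");

InPlaceConfigurationSource source = new InPlaceConfigurationSource();
source.Add(typeof(ActiveRecordBase), properties);
ActiveRecordStarter.Initialize(source, typeof(SMS));

Schema creation still has to run against the very connection the test session will use (as CreateSession does above), since each in-memory SQLite connection is its own database.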

Even though those tests execute code from the business logic to the database, they are still unit tests. To take Jeremy's Qualities of a Unit Test as an example, unit tests should be:

  • Atomic - each test gets its own database instance, so they can't affect each other.
  • Order independent and isolated - same as above; once the test finishes, its database goes back to the Great Heap in the Sky.
  • Intention Revealing - throw new OutOfScopeException("Issue with test, not the technique");
  • Easy to setup - check above for the initial setup; afterward, it is merely a matter of shoving stuff into the database using NH's facilities, which is very easy, in my opinion.
  • Fast - it is an in-memory database, so it is fast. For comparison, running this test against SQL Server (localhost) takes about 4.8 seconds (reported by TestDriven.Net, and including all the initialization); running it on SQLite takes 3.4 seconds. Running an empty method under TestDriven.Net takes about 0.8 seconds. Most of the time is spent in the initial configuration of NHibernate, though.

Hope this helps....

time to read 4 min | 767 words

I've started looking at writing integration tests for an ASP.Net application. I have a fairly complex application with quite a bit happening in the UI layer. Some pages are fairly simple data entry forms, and some contain UI that is scary just to think about (given the amount of work it would take). It has long been my belief that it is not worth testing the UI layer. A lot of effort goes into it, and it is usually fragile in the face of changes.

The problem is that without tests it is very easy to break stuff, and I get lost in the number of pages / controls that I have there already. It took a failure (in the middle of a client demo, of course) to convince me that it is not enough to verify changes manually. I'm currently investigating WATIR, and it looks like it is fairly simple to work with once we learn Ruby, the API, and the quirks.

Current things that I have issues with are speed, state, encoding, control naming, popups and alerts.

Speed seems to be an issue. So far I have only tested it in interactive mode, but it looks like it takes quite a bit of time to run. Quite a bit of time means that I can actually see things happening. Looking at the documentation, I noticed such things as -f (fast), so it may be a debug-mode slowdown intended to let me keep track of what is going on.

State is the issue of what the current state of the application is for the test. For instance, I may want to try updating an entity, which means that it has to exist, or creating an entity that has certain parents, etc. This requires a lot more thought than just unit tests, since in unit tests I either mock the whole infrastructure layer (and it is not fun, trust me), or I nuke the entire DB between tests. Testing it via ASP.Net is more complex, since I have to take into account such things as caching, etc. This makes it a more realistic test case, but makes it harder to write the test. Oh well, at least the secretary wouldn't do it.

Encoding may be a problem. This is still hearsay only, but I understand that Ruby has issues with Unicode. A lot of the text that I need to verify in my pages is in Hebrew, so this may be a real problem. We haven't run into it yet, but we are just beginning.

Control naming is an ugly beast in ASP.Net; you get names like "ShowArchive1$dgArchive$_ctl3$_ctl0", and they may change very easily. I really don't like to see them in the tests. I think that using indexes to find the controls is just as evil, in any case.

Rant: Why in hell are WATIR indexes 1-based?

Popups and alerts seem to be a weak point in WATIR. I couldn't get them to work no matter what I did, and eventually I had to resort to this to get a simple confirm dialog to work. Just so you understand, this opens a whole new process just to send an OK to the window. There doesn't seem to be any way to get the text of the confirm/alert window. More worrying is the popup functionality; I can't find a way to handle a modal popup nicely. It looks like I would need two tests there, one to handle the code up to the popup and afterward, and another to handle the popup itself.

Some interesting links about WATIR:

time to read 10 min | 1838 words

A couple of days ago I asked about how to test code that generates code. Basically, there are a few options:

  • String comparisons - fragile, but easiest to do.
  • Use a C# parser and check the resulting DOM.
  • Compile the code and test that.

I went with the third option, mostly because it was the easiest to write. I had another project where I tried the strings approach, and I ended up not running the tests because they were so fragile.

Here is the first test that I wrote (the project here is NHibernate Query Generator):

[Test]
public void CanGenerateCodeThatDoesnotProduceErrors()
{
    StringBuilder sb = new StringBuilder();
    TextReader reader = new StreamReader(GetSampleStream());
    TextWriter writer = new StringWriter(sb);
    QueryGenerator generator = new QueryGenerator(reader, new CSharpCodeProvider());
    generator.Generate(writer);
    string code = sb.ToString();

    CodeDomProvider provider = new CSharpCodeProvider();
    CompilerParameters cp = new CompilerParameters();
    cp.GenerateInMemory = true;
    cp.OutputAssembly = "Generated.Context";
    cp.ReferencedAssemblies.Add(typeof(ISession).Assembly.Location); // NHibernate
    cp.ReferencedAssemblies.Add(typeof(NamedExpression).Assembly.Location); // named expression library

    CompilerResults results = provider.CompileAssemblyFromSource(cp, code);
    Assert.AreEqual(0, results.Errors.Count);
}

After writing this, I compiled and ran it, and it worked. Empty code appears to compile, so I refactored a bit and moved things to the Setup and utility methods (CompileCode(), AssertCodeCompiles() and GetAssemblyFromCode()). A rough sketch of what GetAssemblyFromCode() might look like is below.
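The post doesn't show the helper itself, so this is a minimal sketch of what GetAssemblyFromCode() might look like, assumed from the test above (GetSampleStream() is the same sample input used there):

// A sketch only - assumed from the test above, not the actual helper from the project
private Assembly GetAssemblyFromCode()
{
    StringBuilder sb = new StringBuilder();
    QueryGenerator generator = new QueryGenerator(new StreamReader(GetSampleStream()), new CSharpCodeProvider());
    generator.Generate(new StringWriter(sb));

    CodeDomProvider provider = new CSharpCodeProvider();
    CompilerParameters cp = new CompilerParameters();
    cp.GenerateInMemory = true;
    cp.ReferencedAssemblies.Add(typeof(ISession).Assembly.Location); // NHibernate
    cp.ReferencedAssemblies.Add(typeof(NamedExpression).Assembly.Location); // named expression library

    CompilerResults results = provider.CompileAssemblyFromSource(cp, sb.ToString());
    Assert.AreEqual(0, results.Errors.Count, "Generated code should compile cleanly");
    return results.CompiledAssembly; // valid because we just asserted a clean compile
}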

Then, it was time to write a meaningful test, like this one:

[Test]
public void GeneratedAssemblyHasWhereTypeWithNestedCustomerType()
{
    Assembly asm = GetAssemblyFromCode();

    System.Type whereType = asm.GetType("Query.Where");
    Assert.IsNotNull(whereType, "Should have gotten an assembly with a where type");

    PropertyInfo customerProperty = whereType.GetProperty("Customer");
    Assert.IsNotNull(customerProperty, "Where type should have property Customer");
}

Since I have compilable code, I can use Reflection to verify that I generated the code that I wanted. From there, it was a simple matter of writing a test using Reflection, and then writing the code that would generate the expected output.

time to read 14 min | 2770 words

This post is directed at yours truly. Take a look at this class:

[Image: the class in question, which takes a dependency on ISessionManager]

The ISessionManager interface is provided by the Windsor NHibernate Integration facility; it gives me access to NHibernate's ISession, which in turn gives me access to the database.

It is really easy to write code for this system: you pull some stuff from the database, do some minor processing, and you're done. It is testable, because you are using only interfaces. The problem is with the ease of testing. I couldn't quite put my finger on why my tests were cumbersome to write.

Take a look at this snippet:

[Test]
public void LogErrorIfFileHasNotArrived()
{
    MockRepository mocks = new MockRepository();
    ISessionManager sessionManager = GetSessionManagerAndSetupMocks(mocks, new ArrayList());
    // ... (rest of the test elided in the original post)

Where GetSessionManagerAndSetupMocks is defined:

 

private static ISessionManager GetSessionManagerAndSetupMocks(MockRepository mocks, IList list)
{
    ISessionManager sessionManager = mocks.CreateMock<ISessionManager>();
    ISession session = mocks.CreateMock<ISession>();
    ICriteria criteria = mocks.CreateMock<ICriteria>();
    Expect.Call(sessionManager.OpenSession()).Return(session);
    Expect.Call(session.CreateCriteria(typeof(FileState))).Return(criteria);
    Expect.Call(criteria.Add(null)).IgnoreArguments().Repeat.Twice().Return(criteria);
    Expect.Call(criteria.List()).Return(list);

    session.Flush();
    session.Dispose();
    return sessionManager;
}

And that is for a fairly simple test. This creates an environment where it is pretty hard to write the tests, because I need to set up quite a bit of code, and it has to be intimately familiar with the structure of the class under test. I am dealing more with infrastructure than with testing something that adds real value.

I am currently refactoring all the usages of ISessionManager (and NHibernate) into satellite interfaces/classes, which allows me to transform the above test into this:

[Test]
public void LogErrorIfFileHasNotArrived()
{
    MockRepository mocks = new MockRepository();
    ICheckFileArrivedDataHelper dataHelper = mocks.CreateMock<ICheckFileArrivedDataHelper>();

    MemoryAppender appender = new MemoryAppender();
    BasicConfigurator.Configure(appender);

    // This single expectation replaces all the code in GetSessionManagerAndSetupMocks
    Expect.Call(dataHelper.GetFilesNotMarkedByWatcher(3)).Return(new FileState[0]);

    mocks.ReplayAll();

    CheckFileArrivedTask fileNotArrivedTask = new CheckFileArrivedTask(dataHelper, 3);
    fileNotArrivedTask.Execute();

    LoggingEvent loggingEvent = appender.GetEvents()[0];
    Assert.AreEqual(Level.Error, loggingEvent.Level);

    string msg = @"File type: 3 did not arrived in its period!";
    Assert.AreEqual(msg, loggingEvent.MessageObject);

    mocks.VerifyAll();
}

The single Expect.Call on the data helper (marked with a comment above) replaced all the code in GetSessionManagerAndSetupMocks. Now the intent of the test is much clearer, and it is far easier to test this stuff. I no longer have to match every little thing that is going on there; instead, I can focus on the bigger picture.

I still need to test the implementation of the data helper, but that can be done in the integration tests.
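The satellite interface itself isn't shown in the post, but a sketch of what ICheckFileArrivedDataHelper and its NHibernate-backed implementation could look like follows; the property names in the criteria are assumptions:

// A sketch only; the real interface/implementation aren't shown in the post
public interface ICheckFileArrivedDataHelper
{
    // Files of the given type that the watcher has not marked as arrived
    IList GetFilesNotMarkedByWatcher(int fileType);
}

public class CheckFileArrivedDataHelper : ICheckFileArrivedDataHelper
{
    private readonly ISessionManager sessionManager;

    public CheckFileArrivedDataHelper(ISessionManager sessionManager)
    {
        this.sessionManager = sessionManager;
    }

    public IList GetFilesNotMarkedByWatcher(int fileType)
    {
        // All the NHibernate plumbing lives here, out of the task's business logic
        using (ISession session = sessionManager.OpenSession())
        {
            return session.CreateCriteria(typeof(FileState))
                .Add(Expression.Eq("FileType", fileType))       // assumed property name
                .Add(Expression.Eq("MarkedByWatcher", false))   // assumed property name
                .List();
        }
    }
}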

TDDing Log4net

time to read 1 min | 146 words

I have an application where logging is a business concern.
Usually, logging is something that I do so I can remotely debug the application, or have a better understanding of what is going on. This is the first time that the logs are actually a business concern.
I usually don't test logging, so I wasn't sure how to start. I should probably mention that I am using log4net for the logs.

After searching a bit, I found that it is really simple:

[SetUp]
public void TestInitialize()
{
    _memoryAppender = new MemoryAppender();
    BasicConfigurator.Configure(_memoryAppender);
}

And then, in the tests, all you need to do is:
LoggingEvent[] events = _memoryAppender.GetEvents();
And start asserting.
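For instance, a complete test might look something like this (the logger name and the expected message are made up for the example):

[Test]
public void LogsErrorWhenFileIsMissing()
{
    // Hypothetical code under test that is expected to log an error
    ILog log = LogManager.GetLogger("FileWatcher");
    log.Error("File did not arrive");

    LoggingEvent[] events = _memoryAppender.GetEvents();
    Assert.AreEqual(1, events.Length);
    Assert.AreEqual(Level.Error, events[0].Level);
    Assert.AreEqual("File did not arrive", events[0].MessageObject);
}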

Strangely, the best place to read about log4net's strengths is here, in the comments to a blog post (check the second comment, by Ron Grabowski).

time to read 7 min | 1241 words

While it is possible, it is a pain in the ass. I'm working on a project now that doesn't use NHibernate (yes, I am surprised too).

There is very simple data access there, and almost nothing to do beyond touching the data, so I thought that I could simply use ADO.Net. It works, sort of, but my code is full of strings, and it is nearly untestable. The main issue is that all the *DataAdapter classes expect their own *Command, *Connection, etc., which means that I can't mock much of it.

Check out this little piece of a test (part of about 50 lines of setup that lead to two lines of real test):

MockRepository mocks = new MockRepository();
IDbConnectionProvider connectionProvider = mocks.CreateMock<IDbConnectionProvider>();
IDbConnection connection = mocks.CreateMock<IDbConnection>();
IDbCommand getRecordsCommand = mocks.CreateMock<IDbCommand>();
ISender sender = mocks.CreateMock<ISender>();
SetupResult.For(connectionProvider.Connection).Return(connection);
IDbCommand deleteItems = mocks.CreateMock<IDbCommand>();
IDbDataParameter dataParameter = mocks.CreateMock<IDbDataParameter>();
IClock clock = mocks.CreateMock<IClock>();
SetupResult.For(clock.Current).Return(new DateTime(2006, 12, 20, 22, 45, 33));

connection.Open();
Expect.Call(connection.CreateCommand()).Return(getRecordsCommand);

string spToGetItems = "GetLockedRecords";
string spToDeleteItems = "DeleteItems";
getRecordsCommand.CommandText = spToGetItems;
getRecordsCommand.CommandType = CommandType.StoredProcedure;
getRecordsCommand.Dispose();

I could do the same with NHibernate in about three lines, and that would be easy. I'm considering ripping out what I already have and going that route. It would certainly be easier.
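For comparison, the NHibernate version of the query that all that setup mocks could be roughly this (the LockedRecord entity and its property name are assumptions for the example):

// A sketch only; LockedRecord and the property name are assumed
using (ISession session = sessionFactory.OpenSession())
{
    IList lockedRecords = session.CreateCriteria(typeof(LockedRecord))
        .Add(Expression.Le("LockedAt", clock.Current)) // clock is the injected IClock from above
        .List();
    // process / delete the records here
}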

time to read 1 min | 82 words

This is just to vent a personal frustration. There is nothing really new or innovative here.

Recently I've been spending a lot of time talking about a specific feature: how it should be implemented, what the concerns are, how it should be handled, etc. The sad part is that I could have implemented the feature and gotten working code in about one third of the time that it took to discuss the issues.
