Ayende @ Rahien

Refunds available at head office

Lecture Abstracts

Note: I just spent quite a bit of time writing this; I am putting it on my blog so I will remember it exists.

Who am I?

Oren Eini is a senior developer at We!, a consulting group based in Israel, focusing on architecture, data access and best practices. Most often, he works on building complex business systems using .Net 2.0, NHibernate and Castle's frameworks. Oren is an active member in several Open Source projects, including (but not limited to :-) ) NHibernate, Castle and Rhino Mocks. He has a blog at http://www.ayende.com/Blog/ where he publishes his thoughts every once in a while.


Level 4/300: Object Relational Mapping += 2: More than just data <-> object

Object relational mappers are becoming ever more popular, as people developing complex systems find that they need more than the tabular model to work with in their applications. A sophisticated ORM can do a lot more than merely get the data out of the database in object form; it can be a valuable asset in simplifying development and making new things possible. In this session, you will see how you can utilize an ORM in unconventional ways to get an additional, better approach to solving complex issues.
Some of those ways include business rules, localization, state transitions, inversion of control, etc. All done via the ORM layer, and all can be used to drastically simplify the complexity of the given scenarios.


Level 100: Using Active Record to write less code


What would you say if I told you that you can stop writing data access code in .Net? Aren't you tired of writing the same thing over and over again: opening connections, querying the database, figuring out what to return, getting back untyped data that you need to start putting on the form? Do you really see some value in writing yet another UPDATE statement?
The Active Record framework allows you to fully utilize the power of the database, but without the back-breaking work that it used to take. Active Record uses .Net objects to relieve you from the repetitive task of persistence. Those objects are schema aware and can persist and load themselves without you needing to write a single line of SQL. Building business applications using Active Record is a pleasure: the database stuff just happens, and you are free to implement the business functionality.
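As a hedged sketch of what this looks like in practice (the entity and table names here are invented for illustration, though the mapping attributes are Castle Active Record's real ones), a class maps itself via attributes:

```csharp
using Castle.ActiveRecord;

// Illustrative entity; [ActiveRecord], [PrimaryKey] and [Property]
// are the actual Castle Active Record mapping attributes.
[ActiveRecord("Blogs")]
public class Blog : ActiveRecordBase<Blog>
{
    private int id;
    private string name;

    [PrimaryKey]
    public int Id
    {
        get { return id; }
        set { id = value; }
    }

    [Property]
    public string Name
    {
        get { return name; }
        set { name = value; }
    }
}
```

With the class mapped, `new Blog()` followed by `Save()` issues the INSERT for you, and `Blog.Find(id)` issues the SELECT; no SQL is written by hand.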

Presentation for this can be found here: http://www.ayende.com/91/section.aspx/download/160

Level 200: Rapid (maintainable) web development with MonoRail


If you're a fan of Ruby on Rails and want to see similar capabilities in .NET, or you're an ASP.NET developer looking for an easier way to do things, MonoRail will be irresistible once you find out what it can do for you. Strong support for Ajax makes writing buzzword-compliant web applications a breeze. Utilization of the Model-View-Controller architecture and convention over configuration makes web development with MonoRail a pleasure. Free yourself from page life-cycle issues and viewstate worries; start working with MonoRail, where the framework works for you.
This talk will introduce the general concepts of the framework, and how you can use them.

Level 100: Interaction based testing With Rhino Mocks

Beyond the simplest scenarios, all objects have collaborators that they work with. This flies in the face of testing objects in isolation, and it is the problem that mock objects were created to solve. In this talk you will learn what mock objects are, how to utilize them, and best practices on when and how to use them. Rhino Mocks is a mock objects framework for .Net whose core goals are to let the developer rely on the compiler and to work well with refactoring tools.
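To make the idea concrete, here is a minimal sketch in the classic Rhino Mocks record/replay style; the `IMailSender` and `OrderProcessor` types are invented for this example:

```csharp
using Rhino.Mocks;

public interface IMailSender
{
    void Send(string to, string body);
}

public class OrderProcessor
{
    private readonly IMailSender sender;

    public OrderProcessor(IMailSender sender)
    {
        this.sender = sender;
    }

    public void Confirm(string customer)
    {
        sender.Send(customer, "Order confirmed");
    }
}

public class OrderProcessorTests
{
    public void Confirm_SendsMail()
    {
        MockRepository mocks = new MockRepository();
        IMailSender sender = mocks.CreateMock<IMailSender>();

        // Record phase: state the expected interaction as a normal method call.
        sender.Send("bob@example.com", "Order confirmed");
        mocks.ReplayAll();

        new OrderProcessor(sender).Confirm("bob@example.com");

        // Fails the test if the expected call never happened.
        mocks.VerifyAll();
    }
}
```

Because the expectation is a real, compiled method call rather than a string, renaming `Send` with a refactoring tool updates the test automatically; that is the compiler/refactoring point above.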

Level 200: Inversion of Control and Dependency Injection: Breaking out of the dependency hell

Responding to change is the holy grail of software development. Inversion of Control (IoC) and Dependency Injection (DI) are two related patterns that allow you to make significant changes to an application without having to touch every part of it. IoC and DI encourage breaking the application into discrete, highly cohesive parts, so a change, when it eventually comes, is very local. A nice side benefit is that applications that use IoC are also very testable.
This talk will introduce the concepts of IoC and how to use them in your application.
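The core idea can be sketched in a few lines; all class names below are invented for illustration, and the container calls at the end assume Castle Windsor's API:

```csharp
// Minimal Dependency Injection sketch; names are illustrative only.
public interface IMessageSender
{
    void Send(string msg);
}

public class EmailSender : IMessageSender
{
    public void Send(string msg) { /* SMTP details here */ }
}

public class NotificationService
{
    private readonly IMessageSender sender;

    // The dependency is injected, not created internally, so swapping
    // EmailSender for an SmsSender (or a test fake) touches no code here.
    public NotificationService(IMessageSender sender)
    {
        this.sender = sender;
    }

    public void Notify(string msg)
    {
        sender.Send(msg);
    }
}

// With a container such as Castle Windsor, the wiring becomes declarative:
// container.AddComponent("sender", typeof(IMessageSender), typeof(EmailSender));
// container.AddComponent("notifications", typeof(NotificationService));
// NotificationService svc = container.Resolve<NotificationService>();
```

The point of the pattern is visible in the constructor: the change that "eventually comes" lands in the registration code, not in every class that sends messages.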

Presentation can be found here: http://www.ayende.com/91/section.aspx/download/145

Level 4/300: Advanced usage of Inversion of Control containers

You already understand the concepts of Inversion of Control and Dependency Injection; now is the time to see how far we can make the IoC container work for us. This talk will focus on using an IoC container in complex scenarios. We will talk about generic decorator chains and generic specialization, contextful containers and IoC DSLs. These powerful concepts can greatly enhance your ability to respond to change in your application.
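As a rough illustration of what a decorator chain means here (all types invented for this example), each decorator implements the service interface and wraps the next link, so the container can assemble the chain from configuration alone:

```csharp
using System;

// Illustrative service interface.
public interface ICommandExecutor
{
    void Execute(string command);
}

public class BasicExecutor : ICommandExecutor
{
    public void Execute(string command) { /* the real work */ }
}

// Each decorator wraps the next ICommandExecutor in the chain.
public class LoggingExecutor : ICommandExecutor
{
    private readonly ICommandExecutor inner;

    public LoggingExecutor(ICommandExecutor inner)
    {
        this.inner = inner;
    }

    public void Execute(string command)
    {
        Console.WriteLine("Executing: " + command);
        inner.Execute(command);
    }
}

public class RetryingExecutor : ICommandExecutor
{
    private readonly ICommandExecutor inner;

    public RetryingExecutor(ICommandExecutor inner)
    {
        this.inner = inner;
    }

    public void Execute(string command)
    {
        try { inner.Execute(command); }
        catch { inner.Execute(command); } // naive single retry, for illustration
    }
}
```

A container can wire RetryingExecutor -> LoggingExecutor -> BasicExecutor (or any reordering) from configuration, so the chain changes without touching any of the classes.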

Level 300: Writing Domain Specific Languages in Boo

A Domain Specific Language is not just the DSL SDK from Microsoft. A DSL can make working with the domain much easier, since you are capable of leveraging the domain concepts directly. The usual alternative to a DSL is an XML file, and we all know how well a declarative model works when you need imperative concepts; just consider NAnt for a minute and you will see the issue. Usually, writing a DSL in .Net would be a complex undertaking, requiring writing a parser, an interpreter, etc. Boo already handles all of that, and its open architecture means that it is very easy to extend it to express the concepts of the domain. This talk will show you how to build DSLs in Boo and how to utilize this power in your applications.

The thread in the haystack...

I was urgently called today to solve a problem in one of my applications. For some reason, the application would stop processing incoming files; a restart would fix the issue for a short while, after which the problem would reappear. The application is heavily multi-threaded, and has a lot of logging built in to help diagnose issues that may crop up.

The problem was that the issue never even showed up in the logs. It was as if the application had stopped watching the directory, but it should never do that. I began to consider framework bugs and was about to work out how to investigate this when I noticed a recurring pattern in the logs:

Cycle #123:
Checking status of Item #42: Not yet ready
Checking status of Item #43: Not yet ready
Checking status of Item #44: Not yet ready
Checking status of Item #45: Not yet ready
Checking status of Item #46: Not yet ready

Cycle #124:
Checking status of Item #42: Not yet ready
Checking status of Item #43: Not yet ready
Checking status of Item #44: Not yet ready
Checking status of Item #45: Not yet ready
Checking status of Item #46: Not yet ready

Hm... an item is sent to another machine for processing, and the communication between the two parts is done via a web service, on a fairly slow connection.

What happened was interesting: there was a large number of items that had not been processed yet, and at every cycle (~1 minute) all of them were checked independently. The problem was that when enough of them were being queried, the next cycle began before the previous one had finished.

Checking the status of all the items was done on a thread pool thread, and processing a new item (and logging it) also occurs on a thread pool thread. After the application had been running for a while, asking the second server for status started to consume most of the thread pool threads, and processing of new items was delayed until their turn in the queue arrived (although even that is not guaranteed).
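A hedged reconstruction of the failure mode (method and variable names are invented; only the shared ThreadPool usage is the point):

```csharp
using System.Collections.Generic;
using System.Threading;

class StarvationSketch
{
    static void QueueStatusChecks(IList<int> pendingItemIds)
    {
        // Every status check is queued on the shared ThreadPool, and each one
        // blocks a pool thread for the full duration of a slow web service call.
        foreach (int id in pendingItemIds)
        {
            ThreadPool.QueueUserWorkItem(delegate(object state)
            {
                CheckStatusOnRemoteServer((int)state); // slow, blocking call
            }, id);
        }
    }

    static void OnNewFile(string path)
    {
        // New-file processing is queued on the SAME pool, so once status checks
        // occupy every pool thread, this work item just sits in the queue.
        ThreadPool.QueueUserWorkItem(delegate { ProcessIncomingFile(path); });
    }

    static void CheckStatusOnRemoteServer(int id) { /* web service call */ }
    static void ProcessIncomingFile(string path) { /* parse and store */ }
}
```

Running the status checks on a dedicated thread (or a separate, bounded queue) would have kept them from starving the file processing entirely.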

The "fix": change the cycle time for checking status to 10 minutes, and ask the other side to start processing these items more than once a day.

Localizing NHibernate: Contextual Parameters

Please note that this is no longer supported behavior in NHibernate 2.1 and up. It was a hack to begin with, and it isn't guaranteed to continue working.

I can't think of a good name for this post, but it is a very cool discovery. First, let us examine the scenario: a customer of mine bought a database of localized data, which had the following structure:

(Image from clipboard).png

Now, what the customer wanted to be able to do is something like this:

Product product = session.Get<Product>(5);
productName.Text = product.Name;

And get the correct name for the user's culture. In addition to that, they didn't want to have to load the entire product names collection and get the name in memory. The database includes several hundred thousand products, localized to several languages, so this is a big perf concern.

I thought about this quite a bit, and was nearly at the point of telling them that it couldn't be done when I recalled that NHibernate 1.2 added filter capabilities. A quick test proved that it is possible to do this with NHibernate, using the following approach:

First, we need to map the product, like this:

<class name='Product'>
       <id name='Id'
              column='id'>
              <generator class='native'/>
       </id>
       <property name='Amount'/>
       <property name='Size'/>
       <property name='Name'
                       formula='(SELECT ProductNames.Name FROM ProductNames
                       WHERE ProductNames.Id = id and ProductNames.CultureId = :CultureFilter.CultureId)'/>
</class>

Please note the Name property: we map it using a formula, which is just a piece of SQL that we put in the mapping. The new thing here is that we can refer to :CultureFilter.CultureId in the formula. Where does it come from? As you can see above, we need this to be transparent to the developer using the code.

The secret is in the following bit of mapping:

<filter-def name='CultureFilter'>
       <filter-param name='CultureId' type='System.Int32'/>
</filter-def>

Here we define a filter that takes parameters; usually a class will use the filter-def parameters to express a filter according to these parameters. But we don't want to filter the results, we want this to just happen. It turns out that using the full name of the filter parameter is something that NHibernate understands anywhere, so...

//This is usually in Application_BeginRequest
session.EnableFilter("CultureFilter").SetParameter("CultureId", Thread.CurrentThread.CurrentCulture.LCID);

Product product = session.Get<Product>(5);
productName.Text = product.Name;

And you get exactly what you wanted! I was very pleased when I was able to come up with such an elegant solution :-D

Why I don't like MS Test...

In response to a question about why extend NUnit instead of MS Test, Scott Bellware gives one of the most eloquent responses that I have seen in a long time.

I personally don't use MS Test.  It's a niche product built by a team that was largely detached from the developer testing community.  It's mostly visual glitz aimed at people who don't really have the experience to be discerning, and who can be manipulated into buying tools by blinking lights and shiny surfaces.  The VSTS product line segmentation is out of touch with the reality of the cross-functional roles that developers are increasingly called to play.  It's way over-priced.

There would be little value in me writing extensions for MS Test.  The limited community of folks who have adopted it are largely following Microsoft's guidance on developer testing and thus are missing the BDD point as widely as Microsoft missed the TDD point.  The open source world is innovating in this space much faster than Microsoft can hope to - both in tooling and the approaches driving the tooling.

The above summarizes my opinions on MS Test much better than I could.

Active Record & Repository

I am currently thinking about my next project, and I really want to use Active Record, and at the same time, I really want to be able to utilize Repository<T>, with or without decorator chains.

A long time ago I made sure that Active Record would be usable without utilizing the "Active Record"-ness of it, so it wasn't that hard to build an IRepository<T> implementation for it. Implementing UnitOfWork was a bit more tricky, since I wanted to keep the option to use NHibernate / Active Record at will; as always, another layer of abstraction is the answer.

I went to such lengths mainly because I didn't want just the Active Record RAD capabilities, but to take advantage of the other benefits that it offers (validation is one, ARDatabind is another, etc). I am not so sure that this is a good idea, though.

I am considering using Active Record just to generate the mapping (since it has very strong cross-inferencing capabilities), but I am not sure if that is a good idea in the long run...

Any ideas?

A Challenge: Simple HR Model + Rules

It seems like no one is ready to take my Linq Challenge, so I decided to expand it a bit and meet my own standards. I decided to implement the challenge in Active Record, since it is much faster to work with than NHibernate. Note that this is just the object model and the mapping, nothing more.

Overall, this is 800 lines of code and 20 lines of configuration, and I think that it took about three to four hours to build this demo.

Here is the object diagram:

OhMy.png

And the database model:

database.PNG

You can download the sample project here, I included a SQL Script to create the database in the file.

Just to note, this is not a toy sample; this is a complex model that is capable of expressing very annoying business requirements. I added a bit of recursive rules, just to make it a bit more realistic. The real project has a lot more stuff to make it easier to work with the model, but in essence, this is the way that I built the project.

Just to clarify, I am truly interested in seeing other solutions to this problem, preferably not ones in NHibernate / Active Record. Mainly because I think that this is complex enough to be a real world problem and not something that can be solved with demo code.

Code complexity

If we are talking about code comments, here is a piece of code that I don't think deserves a comment:

public IEnumerable<RuleResult> Validate(Rule rule, DateTime date)
{
       // Update: moved to an explicit method because it is obviously not working
       if (false == this.IsInRange(date))
              yield break;
       yield return rule.Validate(this, date);
       foreach (EmployeeSnapshot snapshot in EmployeeSnapshots)
       {
              if (snapshot.IsInRange(date) == false)
                     continue;
              IEnumerable<RuleResult> enumerable = snapshot.Validate(rule, date);
              foreach (RuleResult result in enumerable)
              {
                     yield return result;
              }
       }
}

What do you think?

MS Consulting and The Client's Best Interest

Karl Seguin has posted MS Consulting : One Consulting Company To Rule Them All, which I read with a sense of sinking horror.

The problem with software consulting firms is that their incentives likely don't line up with their client's...

[some paragraphs talking about how a consulting company can screw their clients...]

Enter Microsoft.

Assuming we are strictly talking about Microsoft technologies, Microsoft is best positioned to solve the problem.

I work in a consulting company (which is also a Microsoft Gold Partner) so I am probably biased.

Karl then goes on to the really big issue with this suggestion:

Of course, there are flaws with my approach. First, it assumes that Microsoft Consulting is able to deliver quality products, hire quality developers and properly manage them. ... Consultants would likely be pressured to push Microsoft technologies that really aren't necessary (i.e, build something for InfoPath and require the company to buy 3000 copies of the program).

I can speak from second-hand experience of having to deal with MS consultant "advice". It consists of "use [only] Microsoft products". I recently had to battle against using SharePoint and BizTalk in a project where they are completely the wrong tools for the job. And I had to explain to management (about 9 months ago) that no, using DLinq is not going to be a viable solution for a long time yet, because a MS Consultant had told them that this is the One Microsoft Way to do data access from now on.

I have a big problem with the tendency to go with all-Microsoft (and only Microsoft) solutions when there are often better alternatives around. I have yet to find the Microsoft consultant who would prefer using NUnit to MS Test, despite some serious flaws in MS Test, for instance. Or suggest using log4net instead of bringing the whole EntLib to a project (thereby increasing the complexity of configuration alone by an order of magnitude).

There are some crappy consulting firms out there; one of the things that We! does is provide code review services for companies that want an independent review of the product that they are getting. Some of the code that I have had to go through is so nasty it is in the monthly WTF zone. Here is an actual quote from one of the mails I sent after I did a performance review on a system:

The author of this piece of code has managed to achieve the unique state of being able to go very deep into the framework, while combining absolute cluelessness about the reasons why [a problem] occurred. I have to say that I am impressed with the ability to dig so deeply to find the core issue, and amazed that at the same time, he managed to so completely miss the target. This code manages to be both ugly to use and the most inefficient way to do [a particular thing] that I have yet to see. All points for inventiveness, zero points for thinking.

I recognize that getting really bad products from a consulting company is not that rare. There are better ways to handle this than to trust that Microsoft would do a better job. The more likely scenario is that you would start spending a lot more money on licenses (and on training/consulting about how to configure/administer/manage your new software) than before.

I currently have a system in production that is using SQL Express, precisely because the data the application is managing is small, and can be purged on a regular basis. This meant a drop of $6,000(!) in the project price. What do you think a MS Consultant would have chosen? Would it have served the client's interest better?

Karl suggests that it is easier for the customer to detect a sales pitch in the style of "you should use BizTalk" than to spot getting a really crappy product. I would say that the reverse is true. If a manager is unable to get independent code reviews, or unwilling to heed their advice, there is no guarantee that they will be able to understand whether BizTalk (or SharePoint, or MS CMS, or the like) is a good solution for the problem at hand.

I personally know of at least one BizTalk installation that I could replace with about three days of work, and get more maintainable code, better scalability, far better robustness, etc. In that case, the client certainly hasn't benefited from what BizTalk has to offer.

I can tell you that I (and We!) take a great deal of pride in what I create. I tend not to ship crappy code, on an aesthetic basis if nothing else. I have had to go back to old projects, for code harvesting, additional development, bug fixes, etc. Producing crappy code means that I go home depressed, and I intend to do "this computers stuff" for a long time.

The core problem still exists, of course. One of the solutions is to have someone from the client side (either in-house or a third party) review the code and make sure that it matches the required standards. In most of my projects there is such a person, and it is my responsibility to walk them through the code and explain stuff (and have heated arguments ;-) ). The other is to have some sort of a support contract, which may include clauses about fixing bugs in the application, acceptance tests, etc.

To sum it up, I think that it is a naive approach at best.

On Code Comments

Jeff Atwood had a post about code comments, which I completely agree with. One of the comments on the post caught my eye, talking about the assumptions made when commenting:

It is not always feasible to have a programming guru on hand to fix every issue. Not everyone has the same skill set. Sometimes companies are stuck having to maintain code in languages their current staff aren't well-versed in. Gurus aren't available either at all in some areas, or for the money some companies have allotted for their IT staff.

I have posted before about this topic, but I think that this is an interesting take on it. When I comment, there are some assumptions that I make about the person reading the code:

  • That s/he knows the language, or is capable of learning it. For instance, I tend to use this quite a bit:

    public string CacheId
    {
      get
      {
         return ( ViewState["CacheId"] ?? (ViewState["CacheId"] = Guid.NewGuid()) ).ToString();
      }
    }

    If the dev reading my code doesn't know what the null coalescing operator is for, or doesn't understand what the expression chaining is doing here, then there is no chance they can follow the rest of the code. I don't write my code to be read only by gurus, but I will not limit myself to the basic features of a language just because a newbie will not understand them. Anonymous delegates are another thing that I like to use in many places, With.Transaction(delegate) for instance.
  • That s/he has at least a rudimentary understanding of the domain and the model. This is much harder than merely understanding the language/framework, by the way. If we return to the HR model for a second, here is a piece of code that gives the employee a 10% raise:

    SalarySnapShot raisedSalary = employee.At(startDate).Salary.Copy(startDate, endDate);
    raisedSalary.Amount *= 1.1m;
    employee.Salaries.AddOccurance(raisedSalary);

    There is a lot going on here. For instance, Copy will return a snapshot with modified validity dates, and adding an occurrence to the salaries will adjust the other salaries' dates to fit the new salary (in itself a very complex problem). To me, the code is perfectly clear, and it is not hard to follow technically; the problem arises when someone who doesn't understand the temporal model in use tries to follow it. I had a lot of problems coming to grips with it myself. I could put a comment there explaining why I need a temporal copy and what AddOccurance is doing, but this is a comment that would need to be repeated each and every time I touch a temporal object. I consider repeated comments a nasty code smell.
  • That they understand the technology:

    With.Transaction(delegate
    {
       Employee employee = Repository<Employee>.Find(empId);
       SalarySnapShot raisedSalary = employee.At(startDate).Salary.Copy(startDate, endDate);
       raisedSalary.Amount *= 1.1m;
       employee.Salaries.AddOccurance(raisedSalary);
    });

    This is a piece of code that will save the copy even though we never call Save() explicitly. I will sometimes throw a Save() anyway, just because it is clearer this way, but not always.

So, what do I comment?

I comment why I am not doing things when the situation is unique:

if(start > end )
  end = end.AddDays(1);// Need to make sure that the end date is after the start date, even though we are only using the time portion.

When there is a hidden catch:

group.AddUser(currentUser);//Will propagate the addition to all linked groups automatically.

Bug fixes:

employee.AddNote(newNote);
//We are leaking the this pointer here, because AddNote sets newNote.Employee = this.
//This can cause problems when we group things by employee, so we do it explicitly, to set it to the proxy.
newNote.Employee = employee;

Time to return to MbUnit?

I am following Andrew's blog, and it looks like MbUnit is not only being actively maintained again, it is starting to get some really interesting features. I stopped using MbUnit when I needed some tools that were NUnit specific and I got tired of porting them to MbUnit. Since then I have had many occasions where I was frustrated by features missing from NUnit that I loved in MbUnit.

I am supposed to start a new project soon, where I am going to emphasize unit testing to a much stronger degree than we currently do (we test the business logic, but that is about it, and 95% of the tests were written by me). The way it is looking now, I think that we will use MbUnit for unit testing. With TestDriven.Net, it doesn't matter anymore...

The best part about Reserve Duty

Some random thoughts:

  • I got to tell clients, "I can't talk right now, I'm on the shooting range..." with real shots in the background.
  • I got to meet some friends I haven't seen since I left the army.
  • I only had about 2 hours of computer usage in the whole five days (compared to the ~10 hours daily that I usually have).
  • It is OK not to put the phone on mute and not return calls for a couple of days.
  • Shooting is always a pleasure, of course.
  • Proving that I am a replaceable resource at work.
  • Five days and not a single thought about code... The last time that much time passed, I was busy literally 24/7 in operation Aqua-White, and that was two years ago.
  • Learning to appreciate the really simple things, like sitting.

Reserve Duty: Debriefing

Here are the mandatory pictures: 

temp343.PNG

I learned several things last week:

  • If you go to the desert in December, it is going to be cold.
  • It is amazing to see the mileage that the army can get out of a poor old jeep (נ"נ).
  • Riding cross country on the back of a jeep older than me hurts.
  • I actually have duties and responsibilities in the army - that came as a shock.
  • I forgot how much fun shooting is.
  • When I put on the uniform I also put on a set of values under which it is reasonable for me to go to sleep in the middle of the freaking desert (in December), and okay to sleep ~3 hours a night and work 20 hours a day.
  • It takes around three days to get back to a normal frame of mind.
  • It is amazing just how many friends I forgot that I had in the army, and how many ended up serving in the same regiment as I do.
  • I took the laptop, the mp3 player and 6 books; I didn't even open the bag they were in all week.

Damn, I am glad to be back.

Vista Content "Protection"

Take a look at this, it is pretty long, but it contains a lot of stuff that frankly scares me.

In order to work, Vista's content protection must be able to violate the laws of physics, something that's unlikely to happen no matter how much the content industry wishes it were possible.

What is more scary is the amount of times I see "This cost is passed on to all consumers" in the document...

Oops! Site was down

Sorry about that; while I was in the army, the domain registration expired, and it took a while to renew because I wasn't quite sure what the issue was until today.
I got a bunch of posts coming, so be ready.


Reserve Duty Training

(Image from clipboard).png

  • This is a gun; it kills people.
  • The wide end goes on your shoulder.
  • The narrow end is pointed at the tin can.
  • There is a trigger somewhere, look it up in the manual.

This is more or less what I expect...

 

Going off for Army Reserve Duty

In the bloody desert, all the way to the end of the world and beyond, sigh...

No idea what kind of connectivity I'll have there, so see you in a week.

I love Office 2007

I am in the middle of a looong install of VS SP1, so I can't code... When I can't code, I tend to write. Since I just had a series of posts criticizing Microsoft (with another one that I'm busy writing now), I wanted to stop and thank the Office 2007 team.

Office 2007 is a wonderful application suite, I am currently writing an IoC presentation in PowerPoint, and I am loving what I can do with it. Just take a look at my current slide:

(Image from clipboard).png

I am extremely annoyed by the two to four "standard" presentation themes (all of which are mostly blue), and I really like that I can create visually pleasing results with such ease. The same goes for Word 2007; I wrote my MSDN article with it, and it produced a very good looking document with very little effort.

Office 2007 makes my life easier, and to the Office team: you did a hell of a good job.

VS SP1: What the HELL is it doing?

You know you are in bad shape when installing a patch takes longer than installing an OS. The SP1 setup has been running for over two hours now, and it is not done yet. About 30 minutes ago it prompted me to close SQL Management Studio, so I assume it is still alive.

(Image from clipboard).png

Microsoft, this has better be worth it...


Visual Studio 2005 Service Pack 1

Yes, it RTMed, six months late, but that is not what I am talking about here.

Whatever bugs it fixes are classified, apparently, since the knowledge base link points to a dead end.

Never mind; at 432Mb, it is sure to fix a lot of issues, I am certain...


Linq Challenge: HR Sample Model

Okay, here is an interesting challenge, spurred by a comment Alex left on my previous post about Linq.

Given this model, how would you build it using Linq for SQL (is it possible?) and Linq for Entities? I don't really care about the table layout (in other words, feel free to build something that makes your life easier), only that the model is able to express the required complexity cleanly. A couple of notes: this is a temporal model, which includes most of the usual database semantics (1:M, N:M, M:1).

The two gotchas for the OR/M implementation are that Rule is an abstract class that has several implementations, and that a rule is always attached to an entity (which may be one of several unrelated types: Contract/Employee/Department, etc).

I would be interested to see how both Linq implementations handle this task...

Any takers?

Microsoft & Open Source (Take #2): What would it take to show they care?

Bryan Kirschner (MS - OSS Labs, Port 25) has posted a comment to my previous post about MS and OSS. He raises several interesting points in this comment, which I would like to answer here.

If the community of .NET OSS developers feels like we don't care, I'm doing a bad job.

I am sorry to say this, but I do think that Microsoft doesn't care at best, and actively resists it at worst. Off the top of my head, here are two examples of actions that I would call nefarious:

  • Refusing Mono's BoF at the PDC (link)
  • Microsoft consulting services that won't help you unless your entire stack is MS-approved (link)

I am ready to accept that Microsoft is a huge company and such things are deviations from the official Microsoft policy, if Bryan (or any official from Microsoft) is willing to make such a statement, by the way.

More disturbing is the complete and utter silence from Microsoft with regard to OSS tools in the Microsoft world. There are very few articles on MSDN about using OSS software, mostly centered around NUnit and dated a year or more ago. The only recent one that I could find was written by me :-)

There are no Microsoft products (that I know of) that use OSS products; when they need functionality that exists in an OSS product, they build their own version, even when there are no licensing issues.

There are other examples that bother me as well. MS Test using [TestClass] and [TestMethod] instead of [TestFixture] and [Test] is just... not a wise decision in my opinion; why break the API?

[papers] ...that basically says financial support of OSS has been (the way I read it) self-interested outsourcing of some dev & exploiting projects for commercial profit.

I feel that I should repeat this again: I don't think that Microsoft ought to financially support OSS projects on the Microsoft platforms. If Microsoft feels like sending gobs of money to OSS developers, I would be very happy :-), but that is not something that I strive for. I, personally, "exploit" OSS projects for commercial profit, and my company has seen nothing but benefits from this effort. We are able to do quite a bit because we can rely on a rich set of tools and features (some of which were contributed by yours truly). I got bug fixes from other people, who implemented features that I needed later on, so everyone (including the community at large) benefited. [Would I open source the code that calculates the amount of work an employee has done in a month? No, I wouldn't. But I did contribute a patch to NHibernate that made such a calculation efficient.]

That's (not financing OSS projects) just business--but my gut is that isn't what'll make our communities feel great

What I am looking for from Microsoft is, first of all, recognition of at least some OSS projects. What do I mean by that? Well, to start with, why not publish some official documentation about NRandomProject, instead of publishing an announcement of the availability of Microsoft Random Product in 18 months?

This is a sore point for me. I had a client that I had a lot of trouble getting to use NHibernate, mostly because when they asked Microsoft, they were told "Wait for Linq". That was 8 months or so ago, by the way.

I want to see a Microsoft product that uses an OSS product because it was the best-of-breed tool for the need. I want to see OSS speakers at conferences who are not just talking about integration with Microsoft technologies. It is cool to see that you can use Team System from Eclipse, but it has very little value to me.

My main goal is to stop having to justify the use of an Open Source tool vs. a commercial (not necessarily Microsoft, btw) one. This is a general problem in the Microsoft space, and I believe that the main cause of it is Microsoft itself. A change in Microsoft's behavior would bring about a lot more trust in the very idea of using OSS in commercial settings.

Let me put it another way: suppose I were a Microsoft employee who wanted to use Ruby on Rails in a (customer facing) project. Assume, for the purpose of the question, that this is a project that hits RoR's sweet spots very well, and the team wants to use it to cut down the effort needed. What would happen? Note that I am specifically not asking about the Port 25 team; I know that Port 25 has a lot more leeway, specifically because you are the OSS labs.

I have the very strong feeling that the team's wishes would be overruled in favor of a Microsoft-centric solution (with BizTalk in the middle, just to make someone happy :-) ).

For that matter, I would be happy to see any Microsoft software that uses OSS libraries/tools that were not developed by Microsoft (and the BSD sockets code doesn't count)...


Movie Review: Eragon


So I went to see Eragon today, after reading the book about a year or so ago.

This is visually a very impressive movie. The dragon is easily the most realistic-looking and beautiful creature that I have seen recently. There are some really amazing scenes with regard to special effects (the fights with the dragons are wonderful, for instance).

There are several problems with this movie. Every now and then a character says something so clichéd that I cringe ("We can do it, together!", etc.), or acts in a completely stupid way (but he is 17, he is allowed).

The bigger problem is that there is about an hour of missing scenes in the movie. The hero goes from being a humble farm boy to a dragon rider in a matter of 10 minutes or so, and the dragon grows from the size of a dog to the size of your house in a matter of seconds. There is very little character development, and there is a lot of plot missing.


Linq for Confusion

Can someone please make sense of the following statement? (Found here, on the LINQ Chat log):

Q:What is the key difference between ADO.NET Entities and LINQ 2 SQL?
A: LINQ to SQL is an ORM over your relational database schema plus some mappings. LINQ to Entities is an ORM over a conceptual object-less model (ERM) that is a mapping over your relational database schema.

And:

Q: Is there still an effort to integrate LINQ to SQL and LINQ to Entities?
A: There is an effort on going to align these products, but not to integrate them together.

Can someone please explain to me the business sense behind the decision to push two (and let us not forget Linq for DataSet, which I haven't heard about lately) competing frameworks that do the same thing? Can anyone come up with a good explanation of the use cases where I would want to use one and where I would want to use the other?

This has a positioning conflict written all over it.


Application Block Software Factory

Take a look at this:

One of the coolest new capabilities we're building for Enterprise Library v3 is the Application Block Software Factory. As its name ever-so-subtly suggests, this will be a software factory for building your own application blocks.

For some unknown reason, I was strongly reminded of this:

When we stepped back and looked at the global tool infrastructure, we determined that people were frustrated with having to manage and operate a hammer factory factory, as well as the hammer factory that it produced. That kind of overhead can get pretty cumbersome when you deal with the likely scenario of also operating a tape measure factory factory, a saw factory factory, and a level factory factory, not to mention a lumber manufacturing conglomerate holding company.

And that is all I am going to say about it today.


MsBuild overridable tasks?

I have the following structure in a common.build file:

  • Clean
  • Compile
  • Test
  • Zip
  • Publish

What I would like is to be able to stick additional extension points in the middle, so that projects can execute their own steps. For instance, I need a Merge step for Rhino Mocks that most other projects do not need.
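One pattern that might work (a sketch; the target and file names here are assumptions, not from the real build file): MSBuild lets a later target definition override an earlier one with the same name, so common.build can declare empty hook targets that individual projects redefine after the import.

```xml
<!-- common.build: declare an empty hook target and wire it into the chain. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="Compile">
    <Message Text="compiling..."/>
  </Target>

  <!-- Empty by default; a project may override it. -->
  <Target Name="AfterCompile"/>

  <Target Name="Test" DependsOnTargets="Compile;AfterCompile">
    <Message Text="testing..."/>
  </Target>
</Project>
```

```xml
<!-- Rhino Mocks project file: the definition after the import wins. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Import Project="common.build"/>
  <Target Name="AfterCompile">
    <Message Text="merging assemblies here (e.g. via ILMerge)"/>
  </Target>
</Project>
```

Because DependsOnTargets is resolved at execution time, the overriding AfterCompile runs between Compile and Test without common.build having to change.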

Any ideas?
