How Technology Consulting Firms Die

I recently got emails from four former colleagues, all on the same day, telling me that a firm I worked for a long time ago had missed payroll. It was a very saddening thing, for sure – one of those things that makes you ache for people who are stuck in companies that are falling apart. Because bad consulting firms tend to die in bad economies, rather than limp sickly along in good ones, the death of a firm almost always happens at the worst time for the true believers who stuck through to the end.

How does this happen, though? What is it that makes a firm die?

At one level, technology consulting is not that complex of a business. You have people, you bill them out on projects, and the difference between what they bill and what they cost is your profit. How can you possibly screw that up?

A healthy firm retains earnings during the good times, so it can carry at least some of its people through the lean times – at least to a point. There are firms that don’t do this, firing savagely exactly to demand (aka “zero bench”), but most of them quickly devolve into body shops, since people generally figure out when they are being sold a consulting company but are really working in a contracting company. If we are talking about consulting, then we have to assume one of the functions of the consulting firm is to smooth out demand, so that consultants are buffered against the raw ups and downs of the markets. A firm that fails to put up these buffers does not die in the sense I am talking about; rather, it simply morphs into a staffing company.

When a consulting firm dies, on the other hand, it is usually because whoever runs it decides to suspend reality and operate as though things are fine for far too long. Consulting firms that experience sales lulls will have to lay off consulting staff at some point. If demand goes down 20% over the course of a year, and stays there, staff will need to be adjusted to meet the new, lower level of demand. Cutting too slowly typically causes utilization of the entire consultant base to slip below the minimum level needed to create any profit at all. Such a firm starts to take on debt to keep people, since there is no other source of money to make payroll.

What happens then? Well, taking on debt means that, all things being equal, getting back to “even” will be harder. Since getting even is more difficult, riskier and riskier projects have to be undertaken, or the cuts that do occur have to be of the much more savage variety. The action that gets taken at this point tends to be much more difficult, more harmful to morale, and ultimately hurts the chance of recovery.

By this time, if fear had not set in yet, it has now. Everything is high stakes. Every deal *has* to get done, even if it means compromising the estimate, lying to the customer, or otherwise taking on more risk. Round and round we go in the death spiral, until there is nothing left other than those few consultants who managed to survive despite the company – by being sticky at their client – rather than because of it.

Sadly, by this time, the number of billing consultants bringing in revenue is small, but the management overhead has not shrunk nearly enough, and even if you operate a zero bench, pay off the debt, AND cut salaries significantly, the firm still can’t make money. Making payroll becomes an act of ever greater desperation, until things like receivables loans are being taken out. It is not long before bankruptcy ensues, and the only assets left are the aforementioned consultants who are on long term staff augmentation assignments. If that. Sometimes, nothing is left whatsoever.

How can a firm avoid this? It helps to make sure – no – make certain – that overhead is being managed aggressively. And by overhead, we mean managers who don’t bill, directors who don’t bill, IT departments that make internal tools, and so forth. You may need some of these things, but especially during a recession, you need to look hard at each of these positions – especially the high level ones that tend to build empires of other non-billables. Personally, the humane thing to do, in my opinion, is to use the network you generate in the firm to find external jobs for those folks, rather than just fire them. That way, instead of having a pissed off former executive, you have someone grateful that you made a smart business decision and found them something else to do – and possibly a new *buyer*. While I am not a big fan of consulting firms acting as job placement firms, doing so for high level non-billable staff is a big exception, as far as I am concerned.

The next way to avoid this is to be brutal about honesty. If the pipeline really looks terrible, you have to be honest with people, and you have to be honest with yourself. It might mean you have to let people go. Doing so carefully, and based on merit – i.e. actual skill relative to salary, to the degree you can backfill projects – is critical. If you are not transparent about why certain people stay or go, it will be assumed the decisions are based on spurious political factors. If you are going to do it on trailing 12 month utilization, at least be honest about it, so people understand the reasons.

The third thing is to make sure you continue to invest in the people you intend to keep. This is not the time to stop investing in training, networking events, or, god forbid, allowing your sales people to invest in developing the business. Bad times are when you need to distinguish your capabilities and be very smart about your investments, not simply stop making them.

Lastly, it is critical to think with your head, not with your heart, when confronted with bad times. I see more firms get in trouble because the decision makers make decisions out of fear, rather than from evidence and data. Fear spreads, causing people to react too harshly, cut too savagely, estimate too aggressively, over-promise, and under-deliver. Check your fear at the door – don’t make business decisions from the same place you run from tigers.

Recessions are awful times. However, they do have a useful function, in that they rightfully kill off firms that ride the tide up during good times but have no foundation to survive the bad. The upside is that better firms can take up the old project portfolios, usually fixing a lot of the problems the old firm caused. Thankfully, when better times return, those survivor firms are the first in a position to pick up the consultant/victims of the previous firm.


New i4o Release – Fluent Syntax, IIndexSpecification

Now that my book is largely done, I finally have time to focus on some things that have not received the attention they deserve lately.  Thankfully, Jason Jarett was kind enough to introduce some great new ideas to i4o, which, after having mulled them over for far too long, I am releasing into the wild as of this weekend.

Details of the changes are nicely documented on his blog.

The gist of the changes is to introduce a better way, leveraging lambdas, to specify an index (the new IIndexSpecification&lt;T&gt; interface).  Prior to this release, you either had to use attributes – something you can only really do with code you control – or you had to use strings to specify the property.  IIndexSpecification&lt;T&gt; is a nice separation of those concerns, so that your classes can be truly unconcerned with how they are being indexed (as we have done with the POCO changes from the previous release), while being more easily refactored (i.e. member name change refactoring will now work with indexes).
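To make the lambda-over-string idea concrete, here is a self-contained sketch of what declaring indexed members with lambdas looks like. The interface and type names below are hypothetical, invented for illustration – they are not the actual i4o contract, whose details are on Jason’s blog:

```csharp
using System;
using System.Collections.Generic;
using System.Linq.Expressions;

// Hypothetical sketch - NOT the real i4o API. The point: indexed members
// are declared with lambdas, so the indexed class stays ignorant of
// indexing, and a "rename member" refactoring updates the index spec too.
public interface IIndexSpec<T>
{
    IEnumerable<Expression<Func<T, object>>> IndexedMembers { get; }
}

public class Customer
{
    public string Name { get; set; }
    public string City { get; set; }
}

public class CustomerIndexSpec : IIndexSpec<Customer>
{
    public IEnumerable<Expression<Func<Customer, object>>> IndexedMembers
    {
        get
        {
            yield return c => c.Name; // lambdas, not strings or attributes
            yield return c => c.City;
        }
    }
}
```

Because each member is an expression tree rather than a string, the indexing engine can still read the member name out of the tree, but the compiler verifies it for you.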

In addition, performance improvements that reduce the number of lookups using reflection have been introduced as well.

I really owe a huge debt of gratitude to Jason, who took it upon himself to introduce these features and share them with the world.  I am very excited about how this has evolved over the past year and a half.

As for where we go next, now that I am done with the book and have more time to work on open source projects, the next round of changes will (finally) widen the number of query types that can leverage the index, similar to the work Rocky and I did with indexing in CSLA.  Look for that sometime in April/May of this year.


Innovation in IT

In IT, too often, we are viewed as a cost, when we should be considered an investment.

Are you measuring ROI? We sure do talk about it a great deal. But when asked, few can point to the actual rate of return on their historical investments. This, as it should, creates skeptical investors – which, in your case as a CIO or IT Director, are the people who sponsor your projects.

That all said, IT as a cost center is an idea that needs to die. This presentation digs deeper into that idea, and presents a framework for identifying the places in your own company where you can start making strategic technology investments.


10 Commandments for CIOs in 2009


 
As we go into 2009 and start to wonder if there is any future for technology at all, it might not be a bad idea to first remember that technology is one of the greatest generators of productivity known to man.  In an era where leverage is limited, and you therefore can’t simply accomplish earnings via leverage, increasing productivity starts to look more attractive.

In other words, in the coming years, sadly to say, the COO is going to matter a lot more than the CFO.

So what is the role of the CIO in all this?  Well, the CIO needs to be helping the COO build a leaner, meaner company.  This presentation is the first in a series that maps out a way for CIOs to help their companies succeed in this post-financial-apocalypse world.


Welcome to The Nomadic Developer

The time has come for a blog about the consulting business.  And a book.  But more importantly, a conversation.  Because there are millions of us, and this very topic affects our lives – the lives of the knowledge workers who write our software every single day.

My name is Aaron Erickson.  I am a Principal with ThoughtWorks, home to many like-minded folks who believe, as I do, that the purpose of a company should go beyond mere commercial success to include a higher purpose – including social justice. I have two big passions. One is software, which I write a lot about at my corporate blog.  But another, and frankly, much larger one, is the technology services business itself – which is a chief agent affecting the software that companies large and small use every single day.

Frankly, to most who enter this business – the technology services business – the way that it works is a mystery.  You hear terms like bench time, utilization, backlog, and so forth – and you might know what they mean.  But there is also a lot of misinformation out there – and some of the more dysfunctional companies out there like it that way, because it is a good way of keeping employees in fear, under control, oblivious, and sometimes, all three at once.

It doesn’t have to be this way.  If you know the lingua franca of this business, you can quickly know things like:

  • How secure is my job?
  • How can I get promoted?
  • Is the company growing enough for me to get a promotion?
  • Is right now a good time to ask for a raise?
  • What happens if I am on the bench?
  • How will I know if this project is going to be a death march?

This blog, and my book – “The Nomadic Developer”, available on Amazon and at bookstores near you – seek to answer these questions.  Stay tuned – it is going to be a fun ride!


On Business Intelligence and F#

It is high time that Business Intelligence got the benefits of the language “Cambrian Explosion” and the agile revolution.  Think about BI for a second.  Most of the talk around BI is oriented around tools – a stack that ties together presentation, storage, and logic, all in the name of avoiding dealing with pesky programmers.  To the point that “requires writing no code” becomes a feature point.  How did we get here?  And how do we get out?

In the olden days, leaving aside BI for the moment, if you wanted a report, you had to ask your IT department for it.  You were on their schedule, and more often than not, the backlog was very long.  Leaving aside why this was the case (i.e. budget shortage, lack of IT/business alignment, etc.) – it was.  This begat two primary developments: the shadow IT department, and the market for tools that empowered the business to at least try to generate their own reports independent of IT.

Now, in the intervening years, these two developments have not really stopped at all.  There is still a ton of shadow IT, and a ton of tools that purport to help you generate business intelligence by using an integrated stack of tools that, in theory, allow BI to happen without programmers.  The question is… is this a good thing?

I would say no.  BI tools, more often than not, tie you not only to a platform, but frequently to a specific product.  You can’t take your BI developed in Microstrategy and run it in Cognos, at least not very easily.  And it makes sense why – each of these tools competes on the basis of capabilities, so there is no motivation to port the capabilities of one BI product over to another.  And because there is no obvious short term economic justification from the tool vendor’s point of view, it simply doesn’t happen.

Of course, the medium to long term economic justification for tool vendors for this is very good indeed.  By creating an ecosystem of BI that allows for greater innovation and better solutions, BI will receive much greater investment.  The savvy players who take advantage of this will do really, really well, just like Microsoft prospered by having an open PC platform and Google prospered by having an open internet platform.  What has to happen, however, is someone has to move first, and given the nature of the space – big corporate buyers – it has to be one of the big players to do it with any kind of credibility.

That said, it does not help that there has been little standards innovation in the world of SQL.  Not that it doesn’t happen, but let’s put it this way – nobody is proposing SQL as a new .NET language like they do for F#, Ruby, or even Boo.  SQL is just now standardizing how objects work… and worse yet, the language continues to get balkanized – especially in BI land, where extensions for doing cubes and other specialized functionality tend to differ from vendor to vendor.

So how do we untie this gordian knot and get to a place where BI is portable, testable, and exists in a manner that allows diversity in authoring tools, persistence mechanism, and presentation mechanism?  I humbly submit that F# should be the language of BI.

Why F#?  Well, functional programming in general is oriented towards the folding, summarization, reduction, and calculation of sets of information – that is, data.  SQL is mostly a functional, declarative language anyway, so moving to F# as the lingua franca of data should be a no-brainer.  Imagine a world where BI is:

* Persistence Ignorant rather than Persistence Obsessed

* Portable from tool to tool – so long as it can parse F#

* BI authoring tools allow business users to use a GUI to write F# constructs rather than balkanized SQL constructs

* The benefits of a modern functional language (ASTs, automatic generalization, massive parallelization, etc.) finally become tools easily available to BI

* Allowed to have the benefits that the agile world has brought us (testability, etc.)
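The fold/group/summarize style described above can be sketched even in C#’s LINQ (a functional style F# expresses natively); the data and field names here are made up purely for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A made-up sales dataset. The point: a "BI rollup" is just a functional
// group-and-fold pipeline, portable anywhere the language runs - no
// vendor-specific cube dialect required.
public class Sale
{
    public string Region;
    public decimal Amount;
}

public class Program
{
    public static Dictionary<string, decimal> TotalsByRegion(IEnumerable<Sale> sales)
    {
        return sales
            .GroupBy(s => s.Region)                          // dimension
            .ToDictionary(g => g.Key, g => g.Sum(s => s.Amount)); // measure
    }

    public static void Main()
    {
        var sales = new[]
        {
            new Sale { Region = "East", Amount = 100m },
            new Sale { Region = "East", Amount = 50m },
            new Sale { Region = "West", Amount = 75m },
        };
        foreach (var kv in TotalsByRegion(sales))
            Console.WriteLine(kv.Key + ": " + kv.Value);
    }
}
```

Because the rollup is ordinary code, it can be unit tested and version controlled like anything else – exactly the agile benefits the bullet list is asking for.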

Imagine a world where you write a Domain Specific Language (DSL) in F#, and the BI tools manipulate the DSL.  Imagine being able to swap out different persistence mechanisms based on strict performance characteristics, rather than having to pay the port tax when you move from one persistence mechanism to another.  Few people in the BI world have been exposed to the recent “Cambrian explosion” of new languages that have emerged in the last few years, and that’s a shame, because some cross-pollination would be very compelling for new kinds of solutions to emerge.

A recent Gartner CIO poll reported that CIOs must ‘Make the Difference’ by replacing generic IT with distinctive solutions that drive enterprise strategy.  This means that true BI that differentiates will likely be invested in.  It would be a shame if we continued to have all this BI live on vendor specific islands that were unable to leverage some of the state of the art work going on in computer science.  On the other hand, BI that leverages these new capabilities that the computer scientists like Don Syme are giving us will have a great chance to “make the difference”.

I conclude with a call to action.  If you are doing BI, ask why we are using the same basic language we were using 10 years ago.  If you are a language geek or a software developer, ask why what you are doing – particularly if it generates information that is used in the strategic decision making process – isn’t considered “BI”.  Whichever tool vendor is first to get to this vision will probably get a great deal of control over how it gets done – and the field is very green at the moment for someone to fill this gap :)


Consulting Firm Archetypes Continued: FEAR Consulting

Last weekend, while at the Twin Cities Code Camp, over a few drinks some fellow consultants and I were able to share some “war stories” from our consulting pasts.

Boy, and I thought I had it bad.  Some fellow Magenicons (Magenic being my former employer) described to me experiences they had at employers prior to Magenic that I can only describe as working for “FEAR Consulting”.  FEAR Consulting is the kind of place that would truly make Machiavelli proud.  At FEAR, you report time in six minute increments.  Yes, six minute increments.  In other words, day-to-day biology has a role in your time report (for the uninitiated, they probably have a line item for “bathroom” when you submit time).  Everyone hates being micro-managed, but at FEAR, that would be an improvement over the nano-management you are subject to there.

True story – at FEAR, when you go to a colleague to ask a question, you have to barter away some of your billable time to the consultant who answers it.  Which is a serious problem, because at FEAR, you do not get your full salary unless you bill 45 hours per week.  Or you have to make up the deficit with part of the two weeks of PTO you are allocated for the whole year.  Which adds up fast.  And god forbid you go on the bench… muhaaaaa… no vacation for you!

Now, god forbid you want to participate in, say, a code camp or a user group.  You, proud developer, even land a speaking engagement on your own time on a Saturday (if you have not been called in to work).  FEAR will, literally, tell you that you can’t go, since you are an agent of the company, and they want to keep their “trade secrets” in house.

Yes, life sucks at FEAR.  Why does anyone stay there?

Well, FEAR understands that you can go a long way by keeping people exhausted.  They make you feel bad about yourself, so that you don’t ever go anywhere else.  In fact, at FEAR, they are masters of telling you that you are worthless, that you could never do any better.  They are the corporate version of the guy who controls you by killing your self-esteem – occasionally giving you a carrot, but otherwise continually beating you down with a stick.  If you worked for them in 1999, you had no idea the market was good, because the message at FEAR is that you are always replaceable, and if you ever make a mistake, you are done, and nobody in their right mind will hire you.

I previously wrote about BOZO Consulting.  Someone working for FEAR would be infinitely better off getting a job at BOZO: BOZO is merely indifferent towards your progress, while FEAR actively hinders it so you never step out of line.  BOZO says don’t make mistakes or take risks, but might tolerate it if you do anyway.  FEAR will dock your pay if QA reports a bug.

There are companies that fit the archetype, but I would never name them, because they tend to have very active legal departments that love to intimidate by threatening to sue – something that, as a humble blogger for a good company, is not in my best interest to invite.

I wish I could use this post to reach out to employees of FEAR and tell them there is a better way (and of course, in the process, recruit them), but unfortunately, at FEAR, not only can’t you read blogs at work, but you hate your job so much you probably spend your time as far from a computer as possible.


C# 3.0 is a Dynamic Language

There, I said it.

I hope, finally, we can start to put the old dynamic vs. static language schism to bed.  C# is now a dynamic language.  Rubyists, eat your hearts out (and I mean that in good fun).

Let’s go down the line – things that make a language dynamic:

‘Eval’ – or more generally, the ability to construct a data structure and compile it, all at runtime.

All this is possible within the Expression namespace.  While I can plug my own MetaLinq, with its ExpressionBuilder, as a way to build an expression more intuitively than using the factory methods, either way, it is clearly possible to have "Eval" style functionality within C#.  In C#, as in Lisp, you can write programs that write programs.  And with 3.0, you don’t even have to deal with Reflection.Emit to do so.
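A minimal sketch of that “Eval” style, using the factory methods directly: build the tree for x * x + 1 as data, then compile it into a delegate. The function itself is just an arbitrary example:

```csharp
using System;
using System.Linq.Expressions;

// "Eval" in C# 3.0 terms: assemble an expression tree at runtime, then
// compile it into a real delegate - a program writing a program.
public class Program
{
    public static Func<int, int> BuildSquarePlusOne()
    {
        ParameterExpression x = Expression.Parameter(typeof(int), "x");
        // Hand-assembled tree for: x => x * x + 1
        Expression<Func<int, int>> lambda =
            Expression.Lambda<Func<int, int>>(
                Expression.Add(
                    Expression.Multiply(x, x),
                    Expression.Constant(1)),
                x);
        return lambda.Compile();
    }

    public static void Main()
    {
        Console.WriteLine(BuildSquarePlusOne()(5)); // 26
    }
}
```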

Higher Order Functions

Lambda Expressions are part of C# 3.0 – part of what makes a where clause in LINQ able to work.  The syntax is actually pretty reasonable as well.
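For instance, a Where clause is just a function being passed to another function – a lambda stored in a variable works the same way:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Higher-order functions: Where takes a function (the lambda) as an argument.
public class Program
{
    public static IEnumerable<int> Evens(IEnumerable<int> source)
    {
        Func<int, bool> isEven = n => n % 2 == 0; // a function as a value...
        return source.Where(isEven);              // ...passed to another function
    }

    public static void Main()
    {
        Console.WriteLine(string.Join(",", Evens(new[] { 1, 2, 3, 4, 5 }))); // 2,4
    }
}
```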

Implicit Typing

Gone is the requirement that you put a formal type on everything you use.  Now, it is a good idea that they kept that requirement for public references – things that will escape the comfortable confines of your assembly – but within it, being able to use the var keyword, especially in programs where the shape of the objects is expected to change a lot, is a good thing, contributing to the dynamic nature of the language.
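A small illustration: var is not just convenience – for anonymous types, which have no name you could write down, it is the only option. The point values here are arbitrary:

```csharp
using System;

// Implicit typing: the compiler infers the (still static) type.
public class Program
{
    public static double Distance()
    {
        var point = new { X = 3.0, Y = 4.0 }; // anonymous type - var is required
        var d = Math.Sqrt(point.X * point.X + point.Y * point.Y);
        return d;
    }

    public static void Main()
    {
        Console.WriteLine(Distance()); // 5
    }
}
```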

Continuations

Anyone who has implemented an enumerator in C# 2.x knows that we have had continuations (via the yield return statement) for some time, though I do not see them used a lot in practice.  Don "COM Is Love" Box has a great post from 2005 talking about the concept.
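The classic demonstration is an infinite sequence, which only works because yield return suspends the method between items; Fibonacci is just the usual example:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// yield return pauses the method and resumes it on the next MoveNext -
// a limited form of continuation the compiler builds for you.
public class Program
{
    public static IEnumerable<int> Fibonacci()
    {
        int a = 0, b = 1;
        while (true)
        {
            yield return a; // execution suspends here until the next item is pulled
            int next = a + b;
            a = b;
            b = next;
        }
    }

    public static void Main()
    {
        Console.WriteLine(string.Join(",", Fibonacci().Take(6))); // 0,1,1,2,3,5
    }
}
```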

Introspection

Leaving aside the reflection namespace, the fact that you can cast lambdas to expression trees in C# 3.0 – the whole basis for what makes something like i4o possible – demonstrates that introspection is a huge part of the new innovative stuff coming out in C#-land.
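Here is that trick in miniature: assign the same lambda syntax to an Expression&lt;…&gt; type and you get its structure as data, which you can inspect instead of execute. The string.Length example is arbitrary:

```csharp
using System;
using System.Linq.Expressions;

// Casting a lambda to Expression<...> exposes its structure as data -
// the same trick that lets an indexing library see which member a
// predicate touches.
public class Program
{
    public static string InspectedMemberName()
    {
        Expression<Func<string, int>> lengthOf = s => s.Length;
        var body = (MemberExpression)lengthOf.Body; // inspect, don't execute
        return body.Member.Name;
    }

    public static void Main()
    {
        Console.WriteLine(InspectedMemberName()); // Length
    }
}
```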

Now, this does not mean C# is dynamically typed – which is different from being a dynamic language.  There is a great whitepaper from Erik Meijer and Peter Drayton about the subtle differences.  However, what I am saying is that the lines are almost certainly blurring between static and dynamic languages.

Of course, if we really wanted dynamic typing, we could have forgotten all this semicolon nonsense and just switched to VB 5 :)


Announcing MetaLinq – Linq to Expressions

It is with great pleasure that I announce yet another flavor of LINQ – MetaLinq – the ability to query over and edit expression trees using LINQ.

Why MetaLinq?  Well, Oren Novotny – who is working on yet another project, SLINQ, aka LINQ to Streams, and who has also contributed to i4o – was asking me if I knew of a mechanism by which one could walk an expression tree and replace certain nodes.  Being the sort of person who always likes a challenge, I decided to help, creating an enumerator extension method for Expressions that allows you to easily walk the expression tree, with the plan of editing the nodes in the tree as I went (i.e. using a LINQ query over the resulting "walk the tree" enumeration).  Then it hit me.  Expression trees are immutable.  I found this out not by reading stuff online, but by hacking away and discovering that all those darn properties of expression nodes are read-only.

Damn.  How to get around this one?

Well, after reading Jomo Fisher’s blog, it became evident that if you want to get a variation of a tree, you have to copy part, or all, of the tree, and carefully replace the parts you want changed as you are doing so.  Thankfully, Jomo put a reasonable approach up on his blog, which uses a visitor pattern and does a selective replacement.  And his method works.

That said, I decided to take a different approach.  ExpressionBuilder, which is part of MetaLinq for the moment, is the result.  The ExpressionBuilder namespace allows you to create an editable shadow of an expression tree, modify it in place, and then, by calling ToExpression on the shadow tree, generate a new, normal, immutable tree.  It has a class, EditableExpression, with a factory method (CreateEditableExpression) that takes any expression and returns an EditableExpression that mirrors the immutable Expression.

For example, to get an editable copy:

Expression immutable = someExpression; // you can’t change immutable directly

EditableExpression mutable = EditableExpression.CreateEditableExpression(immutable);

// ...then do this to convert it back

Expression newCopy = mutable.ToExpression();

In other words, you can now edit expression trees.  ExpressionBuilder is to Expressions what StringBuilder is to Strings.

I will warn you that you can easily shoot yourself in the foot with this.  As of the current version, you can easily create a cyclic graph, which, of course, will create an infinite loop when you try to convert it back into an immutable expression.  While I will be adding code to check for cycles in the future, there is no getting around that having full edit capability on the expression tree can cause subtle bugs.

The project is hosted on CodePlex, and, as with i4o, it is open source – other contributors are welcome to help add to it or fix bugs.  For more information, email me at aaron.c.erickson@gmail.com.


LINQ to Objects and Bi-Directional Binding

There is a lot of great technology coming from Microsoft in this year – there is almost not enough time to take it all in.  That said, there are some areas where we can try to anticipate where some issues are going to occur, be they fast access to objects (i4o), or in this latest installment, enabling bi-directional binding to work correctly on results from LINQ to Objects operations.

You might ask, why is this an issue?  Well, as Rocky Lhotka has pointed out before, the results of a query from LINQ to Objects are not a filtered view of the original collection, but a whole new object (something called a Sequence) that implements IEnumerable&lt;T&gt;, so you can iterate over it and, at least in a read-only sense, bind to it.  Now, presumably you could, in theory, add to or remove from the result (though I don’t know for sure – I have not tried yet)… but the problem is that even if you could, you would be adding to or removing from a different collection, as well as losing anything that was implemented for you in the IEnumerable&lt;T&gt; you started with in the first place.  Which means that, if you are using CSLA.net, you not only will have weird things happen on add/remove, but you will also break features that the framework provides, like N-level undo.

Needless to say, this is the kind of thing that might rain on the LINQ to Objects parade for certain kinds of use cases… if LINQ were not extensible, that is :).  Writers of frameworks – especially frameworks that have custom collections – will want to implement IQueryable&lt;T&gt; on their custom collections in order to allow bi-directional binding on LINQ generated subsets.  IQueryable&lt;T&gt; allows you, in its CreateQuery&lt;TElement&gt; method, to specify what exactly comes back from the different LINQ methods (i.e. Where, Select, GroupBy, etc.).  Of particular interest for filtering is handling your Where call such that it returns an IQueryable&lt;T&gt; whose concrete implementation you control, rather than using the default that LINQ gives you.

The technique I am using relies on two classes.  The first is a simple collection class that implements ICollection&lt;T&gt; and IQueryable&lt;T&gt;, which I am calling CollectionExtendingIQueryable&lt;T&gt;.  The second, derived from the first, is designed to be a read/write view of it – called ViewOnCollectionExtendingIQueryable&lt;T&gt;.  This second class (the view) also implements IQueryable&lt;T&gt;.  The result of a LINQ query with an identity projection – that is, a projection of the whole objects of the enumerable we are enumerating – will now be typed as ViewOnCollectionExtendingIQueryable&lt;T&gt;, which has all the behavior the parent has, and the same data, but is assigned a different expression that IEnumerable&lt;T&gt;.GetEnumerator() will use when generating its result from the where clause.

More simply, we generate a read/write view collection that differs from the original collection only in its GetEnumerator implementation.  In most other respects (most importantly the underlying concrete collection) it is the same object.  If you add to or remove from the filtered version, you add to or remove from the concrete collection, which, potentially, removes it from other filtered views.

Of course, there is a lot of other work you have to do to fully implement IQueryable<T>.  I have done some of it, such as making sure non-identity projections work like normal LINQ projections – but I am sure there is other work needed to do a full implementation.  That said, the important part of this is that it proves the concept that you can have LINQ to Objects and support filter style projections, if you are willing to dive into supporting IQueryable<T>.

Source code is below, with a demo console implementation:

FilteredCollection.cs:

using System;
using System.Linq;
using System.Collections.Generic;
using System.Linq.Expressions;

namespace TestExtendingIQueryable
{
    //NOTE: This is a proof of concept – not designed to be production code

    public class RandomThing
    {
        public int SomeVal;
        public RandomThing(int x) { SomeVal = x; }
    }

   
    public class CollectionExtendingIQueryable<T> : ICollection<T>, IQueryable<T>
    {
        public CollectionExtendingIQueryable()
        { _internalList = new List<T>(); _ex = System.Linq.Expressions.Expression.Constant(this); }
       
        protected Expression _ex;
        protected List<T> _internalList;

        internal List<T> UnderlyingList { get { return _internalList; } }

        public void RemoveBottomItem()
        {
            _internalList.RemoveAt(0);
        }

        public void Add(T item)
        {
            _internalList.Add(item);
        }

        public int FilteredCount
        {
            get
            {
                int cnt = 0;
                foreach (T item in this)
                    cnt++;
                return cnt;
            }
        }

        public int UnfilteredCount
        {
            get
            {
                int cnt = 0;
                foreach (T item in _internalList)
                    cnt++;
                return cnt;
            }
        }

        #region IQueryable<T> Members

        IQueryable<TElement> IQueryable<T>.CreateQuery<TElement>(Expression expression)
        {
           
            MethodCallExpression mex = expression as MethodCallExpression;
            switch(mex.Method.Name)
            {
                case "Where":
                    return (IQueryable<TElement>) new ViewOnCollectionExtendingIQueryable<T>(expression, this);
                case "Select":
                   
                    UnaryExpression selectHolder = mex.Arguments[1] as UnaryExpression;
                    LambdaExpression theSelect = selectHolder.Operand as LambdaExpression;
                   
                    Expression<Func<T, TElement>> selectorLambda
                        = Expression.Lambda<Func<T, TElement>>(theSelect.Body,theSelect.Parameters);
                    Func<T, TElement> selector = selectorLambda.Compile();
                    return this.Select<T, TElement>(selector).AsQueryable<TElement>();
                default:
                    return null;
            }
        }

        TResult IQueryable<T>.Execute<TResult>(Expression expression)
        {
            throw new NotImplementedException();
        }

        #endregion

        #region IEnumerable<T> Members

        IEnumerator<T> IEnumerable<T>.GetEnumerator()
        {
            MethodCallExpression mex = _ex as MethodCallExpression;
            UnaryExpression whereHolder = mex.Arguments[1] as UnaryExpression;
            LambdaExpression theWhere = whereHolder.Operand as LambdaExpression;
            Expression<Func<T, bool>> theParmedWhere
                = Expression.Lambda<Func<T, bool>>(theWhere.Body, theWhere.Parameters);
            Func<T, bool> filter = theParmedWhere.Compile();
            //if we had indexes in this collection, they would be used here
            foreach (T item in _internalList)
                if (filter(item))
                    yield return item;
        }

        #endregion

        #region IEnumerable Members

        System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
        {
            //delegate to the generic enumerator, which already applies the filter
            return ((IEnumerable<T>)this).GetEnumerator();
        }

        #endregion

        #region IQueryable Members

        IQueryable IQueryable.CreateQuery(Expression expression)
        {
            _ex = expression;
            return this;
        }

        Type IQueryable.ElementType
        {
            get { return typeof(T); }
        }

        object IQueryable.Execute(Expression expression)
        {
            throw new NotImplementedException();
        }

        Expression IQueryable.Expression
        {
            get { return _ex; }
        }

        #endregion

        #region ICollection<T> Members

        void ICollection<T>.Add(T item)
        {
            _internalList.Add(item);
        }

        void ICollection<T>.Clear()
        {
            _internalList.Clear();
        }

        bool ICollection<T>.Contains(T item)
        {
            return _internalList.Contains(item);
        }

        void ICollection<T>.CopyTo(T[] array, int arrayIndex)
        {
            _internalList.CopyTo(array,arrayIndex);
        }

        int ICollection<T>.Count
        {
            get { return _internalList.Count; }
        }

        bool ICollection<T>.IsReadOnly
        {
            get { return false; }
        }

        bool ICollection<T>.Remove(T item)
        {
            return(_internalList.Remove(item));
        }

        #endregion
    }

    public class ViewOnCollectionExtendingIQueryable<T> : CollectionExtendingIQueryable<T>, IQueryable<T>
    {
        protected Expression _specificEx;
       
        public ViewOnCollectionExtendingIQueryable(Expression ex, CollectionExtendingIQueryable<T> baseCollection)
        {
            _internalList = baseCollection.UnderlyingList;
            _specificEx = ex;
        }

        IEnumerator<T> IEnumerable<T>.GetEnumerator()
        {
            MethodCallExpression mex = _specificEx as MethodCallExpression;
            UnaryExpression whereHolder = mex.Arguments[1] as UnaryExpression;
            LambdaExpression theWhere = whereHolder.Operand as LambdaExpression;
            Expression<Func<T, bool>> theParmedWhere
                = Expression.Lambda<Func<T, bool>>(theWhere.Body, theWhere.Parameters);
            Func<T, bool> filter = theParmedWhere.Compile();
            //if we had indexes in this collection, they would be used here
            foreach (T item in _internalList)
                if (filter(item))
                    yield return item;
        }

        #region IQueryable<T> Members

        IQueryable<TElement> IQueryable<T>.CreateQuery<TElement>(Expression expression)
        {

            MethodCallExpression mex = expression as MethodCallExpression;
            switch (mex.Method.Name)
            {
                case "Where":
                    _specificEx = expression;
                    return (IQueryable<TElement>) this;
                case "Select":

                    UnaryExpression selectHolder = mex.Arguments[1] as UnaryExpression;
                    LambdaExpression theSelect = selectHolder.Operand as LambdaExpression;

                    Expression<Func<T, TElement>> selectorLambda
                        = Expression.Lambda<Func<T, TElement>>(theSelect.Body, theSelect.Parameters);
                    Func<T, TElement> selector = selectorLambda.Compile();
                    return this.Select<T, TElement>(selector).AsQueryable<TElement>();
                default:
                    return null;
            }
        }

        TResult IQueryable<T>.Execute<TResult>(Expression expression)
        {
            throw new NotImplementedException();
        }

        #endregion

        #region IEnumerable Members

        System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
        {
            //delegate to the generic enumerator, which applies this view's filter
            return ((IEnumerable<T>)this).GetEnumerator();
        }

        #endregion

        #region IQueryable Members

        IQueryable IQueryable.CreateQuery(Expression expression)
        {
            throw new NotImplementedException();
        }

        Type IQueryable.ElementType
        {
            get { return typeof(T); }
        }

        object IQueryable.Execute(Expression expression)
        {
            throw new NotImplementedException();
        }

        Expression IQueryable.Expression
        {
            get { return _specificEx; }
        }

        #endregion
    }
}

Program.cs:

using System;
using System.Linq;
using System.Collections.Generic;

namespace TestExtendingIQueryable
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Demonstration of using LINQ to generate filtered results, using a");
            Console.WriteLine("collection class specifically designed to filter rather than project");
            Console.WriteLine("by default.");
            Console.WriteLine("");
            Console.WriteLine("We will generate a collection with 100 random numbers, between 1 and 300");
            Console.WriteLine("We will then generate two views on the numbers, one for those below 100,");
            Console.WriteLine("the other for those below 200.  Removing the bottom-most item, which is");
            Console.WriteLine("the only item that won't be random (fixed at 42, to fit in both ranges),");
            Console.WriteLine("will affect the count of all three collections (original, filtered view,");
            Console.WriteLine("and different filtered view).");
            Console.WriteLine("");
            Console.WriteLine("Lastly, we will do a typical projection, which will demonstrate that");
            Console.WriteLine("the filtering logic gets out of the way when you select something that");
            Console.WriteLine("is not amenable to filtering.");
           
            CollectionExtendingIQueryable<RandomThing> random = new CollectionExtendingIQueryable<RandomThing>();
            Random rnd = new Random();
            random.Add(new RandomThing(42)); //first one has to be under 100 to run our removal tests correctly
            for (int i = 0; i < 99; i++)
                random.Add(new RandomThing(rnd.Next(300)));
                       
            var filteredResult = from r in random
                                where r.SomeVal < 100
                                select r;

            var differentFilteredResult = from r in random
                                          where r.SomeVal < 200
                                          select r;

            Console.WriteLine("Filtered results (random numbers under 100)");
            foreach (var x in filteredResult)
                Console.Write(x.SomeVal + ",");
            Console.WriteLine("");
            Console.WriteLine("----------------------------");
            Console.WriteLine("Filtered result Count = " + ((CollectionExtendingIQueryable<RandomThing>)filteredResult).FilteredCount);
            Console.WriteLine("Different filtered result Count = " + ((CollectionExtendingIQueryable<RandomThing>)differentFilteredResult).FilteredCount);
            Console.WriteLine("Now we are going to remove the bottom item from the first filtered result, which should reduce the count of all filtered results by one");
            Console.WriteLine("Press any key to continue…");

            Console.ReadKey();

            Console.WriteLine("Count of original collection = " + random.UnfilteredCount);
            ((CollectionExtendingIQueryable<RandomThing>)filteredResult).RemoveBottomItem();
            Console.WriteLine("Count of original collection after removal from filtered list = " + random.UnfilteredCount);
            Console.WriteLine("Count in the filtered list = " + ((CollectionExtendingIQueryable<RandomThing>)filteredResult).FilteredCount);
           
            Console.WriteLine("Count in the different filtered result = " + ((CollectionExtendingIQueryable<RandomThing>)differentFilteredResult).FilteredCount);
            Console.WriteLine("Press any key to test projection…");
            Console.ReadKey();

            var projectedResult = from r in random
                                  where r.SomeVal < 100
                                  select r.SomeVal;
            foreach (var x in projectedResult)
                Console.Write(x + ",");
            Console.WriteLine("");
            Console.WriteLine("--------------------");
            Console.WriteLine("Press any key to exit the demo…");
            Console.ReadKey();

        }
    }
}


LINQ to Objects and Bi-Directional Binding