December 2009 - Posts

It's not even January the 1st and I've already decided that I want to read more. Whilst I'm probably not going to read Paul's list of must-reads for anyone wanting to write a storage engine, more and more of the work I come across requires an understanding of more complex algorithms. I've never done a computer science degree, so I've never learnt even basic stuff like a bubble sort.

Many people bash anyone working on Microsoft technologies who hails a new piece of technology, because often it's not new at all but rather very dated; it's just that Microsoft is only now implementing it in their product. David DeWitt's keynote at PASS on column stores was a real eye-opener as to why some of the other database engines do well in certain markets.

Having said that, although many algorithms have been covered before, the processing power now available means that some that were only theoretically possible are now practical. For example, if you have something that needs to process 1 million records and perform a few thousand calculations on each row, what's the best way to do it? You can write a single SQL statement to do it, but is there a better way?
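As a sketch of the single-statement approach (the table and column names here are entirely made up for illustration, not from any real system):

```sql
-- Hypothetical example: derive a value for 1 million rows in one
-- set-based statement instead of looping over them row by row.
-- dbo.Readings, Value and Normalised are invented names.
UPDATE r
SET    r.Normalised = (r.Value - s.MinValue)
                      / NULLIF(s.MaxValue - s.MinValue, 0)
FROM   dbo.Readings AS r
CROSS JOIN (SELECT MIN(Value) AS MinValue,
                   MAX(Value) AS MaxValue
            FROM   dbo.Readings) AS s;
```

Whether the optimiser parallelises that well, or whether splitting the work up yourself wins, is exactly the kind of question the reading is for.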

So I think next year I'm going to be dusting off some of my old maths books and some newer books and see how far I get before my brain starts to hurt.


Posted by simonsabin | 2 comment(s)

I've just been sent this link by a colleague at work.

Her daughter's high school has just recorded a video in reverse, with lip syncing. No mean feat.

http://www.youtube.com/watch?v=T7TI-AJi2O8

You can read more about the challenge between two high schools here



Posted by simonsabin | 1 comment(s)

I'm starting a whole series of blog posts on the use of Entity Framework and the evil it can do to your database without you knowing it. Your DBA will probably take a big gulp when they look at some of the queries that LINQ to Entities executes; they are some of the worst queries I've come across.

The challenge is knowing how your Entity Framework model and your LINQ queries affect the final queries that get fired at the database. This set of posts will hopefully help you know what to look for in your code that causes a really ugly JOIN clause or a set of nested derived tables.

This post will act as an index to all those posts. This is the list I hope to post:

1. Non-Primary-Key column(s) [????] are being mapped in both fragments to different conceptual

2. The impact of joining on non-PK columns.

3. Should you define associations in the model or not?

4. Impact of compile time of LINQ to Entities queries

5. Literals in LINQ to Entities queries

6. How parameter naming in LINQ to Entities is really bad for performance

7. The impact of parameter lengths with LINQ to Entities queries

If you have any comments on which of these to focus on, let me know and I will prioritise them.


Posted by simonsabin | 2 comment(s)

Well, you might think that my blog has been hijacked, but it hasn't.

This is a statement I put into some feedback to the product team about SQL Server's attitude to bad plans.

If you have a bad query plan in your cache that isn't suited to all the parameter distributions you are using it with, your SQL Server is effectively saying "go on, beat me; I know you're doing wrong, but just beat me, because I'm not going to stop you".

I'm talking about queries that normally process a few hundred pages but, when executed with the wrong parameters, end up processing hundreds of thousands of pages. The worst I've seen was a bad plan involving full text that showed in excess of 1,000 million pages being read for one query. This is the classic case of a query plan gone bad.

Why oh why can't SQL Server do anything about it? It knows the estimated numbers and the actual numbers, because it gives them to you in the execution plan, so why not use them to detect a bad plan, and then bar it from the server?
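In the meantime, the closest you can get is to evict the offending plan from the cache yourself. A rough sketch (the LIKE pattern `'%dbo.Orders%'` is just a placeholder for your own query text, and DBCC FREEPROCCACHE with a plan handle needs SQL Server 2008):

```sql
-- Find the plan handle of the suspect query by searching the cache
-- for its text ('%dbo.Orders%' is a placeholder pattern).
SELECT qs.plan_handle,
       qs.total_logical_reads,
       st.text
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE  st.text LIKE '%dbo.Orders%'
ORDER BY qs.total_logical_reads DESC;

-- Then evict just that plan (paste the plan_handle value from above):
-- DBCC FREEPROCCACHE (0x0600...);  -- SQL Server 2008 and later
```

That forces a recompile on the next execution, which may or may not give you a better plan; it's a workaround, not the automatic detection I'm asking for.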

Why can't SQL Server be a bit stricter and give the plan the boot rather than being a masochist?

"1. gratification gained from pain, deprivation, degradation, etc., inflicted or imposed on oneself, either as a result of one's own actions or the actions of others

2. the act of turning one's destructive tendencies inward or upon oneself."


Posted by simonsabin | 3 comment(s)

Simply put, you can't use the Cellular Emulator on a 64-bit operating system. You will get the following error: "There are not seven pairs of XPVCOM in system"

How naff is that? You have to be running a 32-bit operating system.

If you're using Windows 7, all is not lost: you can use the Windows XP Virtual PC to do the work.


Posted by simonsabin | 1 comment(s)

I've just been trying to build a Silverlight player for SQLBits and have been hitting my head against a wall due to getting the error

AG_E_NETWORK_ERROR

How annoying. Nothing more, just that.

I know the URL is fine, so what is the problem?

After doing some tracing, I found requests like the following in the log of the website I was getting the file from:

2009-12-03 00:56:00 fe80::e9631 GET /clientaccesspolicy.xml - 80 -
2009-12-03 00:56:00 fe80::e963:  GET /crossdomain.xml - 80 -

I don't have either. Looking these up, I find that they are needed by Silverlight for cross-domain security checking. As I don't have either, it bombs out.

So to solve the problem, create a file called clientaccesspolicy.xml in the root of the website with the following content:

<?xml version="1.0" encoding="utf-8"?>
<access-policy>
    <cross-domain-access>
        <policy>
            <allow-from http-request-headers="*">
                <domain uri="*"/>
            </allow-from>
            <grant-to>
                <resource path="/" include-subpaths="true"/>
            </grant-to>
        </policy>
    </cross-domain-access>
</access-policy>

This gives full access. I AM SURE YOU SHOULD MAKE IT MORE RESTRICTIVE, but it's very late and I'm going to bed.
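For what it's worth, a more locked-down version might look something like this (http://www.example.com and /media/ are placeholders; substitute the site that actually hosts your Silverlight app and the paths it needs):

```xml
<?xml version="1.0" encoding="utf-8"?>
<access-policy>
    <cross-domain-access>
        <policy>
            <!-- Only allow requests from this site (placeholder domain) -->
            <allow-from http-request-headers="*">
                <domain uri="http://www.example.com"/>
            </allow-from>
            <!-- Only expose the media folder, not the whole site -->
            <grant-to>
                <resource path="/media/" include-subpaths="true"/>
            </grant-to>
        </policy>
    </cross-domain-access>
</access-policy>
```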

Anyway, if you are encountering AG_E_NETWORK_ERROR and you are trying to get content from another website, you will need one of these files.