Wednesday, June 20, 2012

April 2012 CU is back!


The April 2012 CU is now available again from Microsoft: http://support.microsoft.com/kb/2598151.

Their reason for the recall that happened about a month ago:
“A recent, isolated issue was discovered in the SharePoint 2010 Products April 2012 Cumulative Update that could result in an HTTP 500 error when users delete objects to include documents, lists, and Webs and a new object is created using the same path where the original object remains in the Recycle Bin. […] This issue has been resolved in the revised packages.”

Friday, May 25, 2012

ULS Viewer real-time feeds NOT working

I've had this issue in the Development VM I'm using, where ULS Viewer would never load the logs in real-time from the LOGS folder.

I ended up finding the issue, or at least a solution for my case, in some comments on the Microsoft tool page (here): deleting all the old log files. The current one will be in use and won't be deleted, which is fine.

Another workaround was deleting only the upgrade*.log files, but I didn't have any, so the problem was probably in another log file. It seems that if there are logs that don't follow the formatting ULS Viewer expects, real-time feeds won't work correctly.
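As a sketch of that cleanup in PowerShell (the log folder below is the default SharePoint 2010 location and is an assumption; adjust it to your farm's Diagnostic Logging path):

```powershell
# Sketch: remove old ULS logs so ULS Viewer's real-time feed only sees the current file.
# NOTE: the path below is the default SharePoint 2010 logs location (an assumption);
# adjust it to your farm's Diagnostic Logging path if you changed it.
$logFolder = "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS"

Get-ChildItem -Path $logFolder -Filter "*.log" | ForEach-Object {
    try {
        # The current log file is locked by the tracing service and will survive.
        Remove-Item $_.FullName -ErrorAction Stop
    }
    catch {
        Write-Host "Skipped (in use): $($_.Name)"
    }
}
```

The in-use file throws an access error, which is caught and skipped, so you can run this without stopping the tracing service.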

Saturday, May 12, 2012

Regain control of SQL Server

In the last couple of days, I started using a VM from a template someone else built. My account was a local machine admin, but not a SQL Server admin; I actually had no access configured in SQL Server for my account. But I needed to get control over it and had no one available to help me at that moment.

In order to get sysadmin permissions on SQL Server, I did this:
  1. Shut down SQL Server Services
  2. Open a command line as Administrator
  3. Change to your SQL Server Binn directory
    • cd "c:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Binn"
  4. Run SQL Server in single-user mode
    • sqlservr.exe -m -s MSSQLSERVER
  5. Open a new command line as Administrator
  6. Connect to SQL Server
    • sqlcmd -S .\ (or sqlcmd -S .\MSSQLSERVER)
  7.  Execute commands
    • 1> sp_addsrvrolemember 'domain\user', 'sysadmin'
    • 2> GO
  8. CTRL+C in both windows to exit sessions
  9. Restart SQL Server Services 
PS: You may need to be quick connecting to SQL Server (step 6) after starting it in single-user mode (step 4); otherwise, some other process may grab the session (remember: only one connection is allowed, since we're running in single-user mode!). It's not a bad idea to have both command-line windows open and ready before you press Enter.
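The steps above can be sketched as a single script; the service and instance names below are the defaults and are assumptions, as is the account being granted access:

```powershell
# Sketch of the recovery steps above, assuming the default instance MSSQLSERVER.
# Run from an elevated PowerShell prompt.
net stop MSSQLSERVER                      # 1. stop the SQL Server service

# 4. start SQL Server in single-user mode in its own window,
#    keeping this console free for sqlcmd
cd "C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Binn"
Start-Process .\sqlservr.exe -ArgumentList "-m", "-s", "MSSQLSERVER"

# 6./7. grab the single available connection quickly and add yourself to sysadmin
sqlcmd -S .\ -Q "EXEC sp_addsrvrolemember 'domain\user', 'sysadmin'"

# 8./9. stop the single-user instance (CTRL+C in its window), then restart the service
net start MSSQLSERVER
```

Using `-Q` on sqlcmd runs the statement and exits immediately, which also helps win the race for the single connection.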

Tuesday, May 1, 2012

Unreliable Item Count in Search Scopes

Beware of the Item Count column in Central Administration, on the Search Scope Properties and Rules page, as it may not be accurate.

It does state it is "approximate", but what does that mean?

What happens is that it queries the Search Service for this Item Count using the currently logged-on account (in Central Administration), which may or may not have permission to see all the results. So the Item Count retrieved actually pertains to the current logged-on user viewing the page, and is not a global result.

This will be an issue in highly security-trimmed contexts, where there is probably no account with access to the entire index (except for the crawler account). There, the Item Count will be pretty much useless.

This design flaw in Central Administration seems to have been present since MOSS 2007, as can be seen here.

Wednesday, April 25, 2012

Run Away from Item Level Security!

I have been involved in some discussions regarding Item Level Security and its impact on the performance of a SharePoint farm. Interesting topic. Microsoft has a thorough article on this, download it here: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=9030.

Here is what Microsoft has to say about it, from a boundaries/limits perspective (also here):


"Security scope - 1,000 per list - Threshold
The maximum number of unique security scopes set for a list should not exceed 1,000.
A scope is the security boundary for a securable object and any of its children that do not have a separate security boundary defined. A scope contains an Access Control List (ACL), but unlike NTFS ACLs, a scope can include security principals that are specific to SharePoint Server. The members of an ACL for a scope can include Windows users, user accounts other than Windows users (such as forms-based accounts), Active Directory groups, or SharePoint groups."


This means that if you believe your list will have more than a thousand items, now or in the future, Item Level Security is not recommended. Beware that once you follow that path, it will be tough to re-design your solution.
Another major pain point is governance. How manageable is a solution with Item Level Security? Not very.
You would need some automation: probably workflows/event receivers and some custom-coded framework to help power users manage security. This means more complexity, and we still have the threshold to handle.

An important aspect of this threshold is that having more than 1,000 ACLs in a list/library impacts the whole farm, not just that list (check a very interesting post about this here).

All in all, my opinion on this topic is that you should run away from Item Level Security as much as you can.
Design the Information Architecture in a way that allows you to set security boundaries only on site collections, sites and lists/libraries. It will make everyone's lives easier. My motto has always been: the simpler, the better. Over-engineering is certainly one of the most common problems a team of good developers/consultants can run into.

But if you still need some sort of Item Level Security, there are other paths you can take. I see three possibilities.

First option: Folder Level Security

The way folders work was enhanced in SharePoint 2010, with things like default metadata values based on folders. It was an improvement that put folders back in the game, in my opinion.
In terms of security boundaries, they may be a perfect fit. If, inside a library, you have a reasonable number of separate security boundaries, you can use folders to manage the security. This way, the number of ACLs in the list is restricted to the number of folders.
You could make it transparent to the end user, from a viewing perspective, by not using folders in views. From an upload perspective, the end user would learn that when uploading a document, choosing a folder actually means choosing a security boundary. Since most users have a file share background, most will easily understand and accept that putting documents into different folders means those documents are visible to different users.
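A minimal sketch of turning a folder into a security boundary, using the server object model from PowerShell (the site URL, library, folder and group names below are all hypothetical):

```powershell
# Sketch: break inheritance on a folder and grant one SharePoint group access to it.
# The site URL, library, folder, group and role names below are hypothetical.
$web = Get-SPWeb "http://hostname/sites/projects"
$folder = $web.GetFolder("Shared Documents/TeamA")

# The folder's list item carries the security; documents inside it inherit from it,
# so the library gains only one extra security scope per folder.
$folderItem = $folder.Item
$folderItem.BreakRoleInheritance($false)   # $false: do not copy the parent's ACL

$group = $web.SiteGroups["Team A Members"]
$roleDef = $web.RoleDefinitions["Contribute"]
$assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($group)
$assignment.RoleDefinitionBindings.Add($roleDef)
$folderItem.RoleAssignments.Add($assignment)

$web.Dispose()
```

Note that passing `$false` to BreakRoleInheritance starts the folder with an empty ACL, so only the assignments you add explicitly will apply.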

Second option: Custom components

Another option is making the list and all items super-secure, with all items inheriting security. Only super-user accounts would have read/write access; all other users would have no permissions.
We would then build custom components to view, upload and edit documents/metadata, which would run with elevated privileges and have their own logic.
This is not a trivial solution: there are several limitations, as well as a lot of custom development. Think about what would happen when trying to save a document from the Word client, or when using Explorer View. This would need to be carefully PoC'ed, planned, agreed with the customer, implemented and tested. It would largely depend on the specific requirements. My advice would be against this path.

Third option: Folder Based and Exception Based Security

A third option I can envisage is more complex: a mixture of Folder Level Security with a number of exceptional behaviours. This can be used in more complex scenarios, but it always rests on the assumption that, from a business perspective, it is imperative to have all these documents in the same location, and that a predefined number of folders with their security, plus a restricted set of unique permissions on some of the documents, would solve the problem. We would also need some mechanism to monitor the number of exceptions plus folders, to keep it within a reasonable limit. Again, my advice would be against this path.

Important factors to consider

The most important thing to keep in mind is whether you are able to re-engineer or re-design your customer's process, or what they believe should be implemented, in terms of security.
Sometimes, business users mistake visibility for security, or some sort of responsibility for security. It is important to make the concepts clear to them. Security should be seen as a block: are we setting security only because we want to control what end users see by default, or do we really need these documents blocked from these end users? I have seen a lot of cases where it is the former. If that's your case, you can forget security and use other alternatives, like customized views based on metadata, or even custom-coded components for viewing. It will be much easier, cleaner and more manageable to code viewing components based on any sort of logic than to pursue a solution using Item Level Security.
If you do have security boundaries to respect, and these documents must be blocked, then the first option is a clear winner. It might be tough to convince your customer, but the "Software boundaries and limits" page from Microsoft is there to help you. And if they end up choosing to pursue other strategies and run into problems in the future, you will have clearly made your point.



Visual Studio Tip #2: Run 64-bit version of PowerShell in Post-Build

Whether you are trying to achieve Continuous Integration, Automated Unit Testing or any other scenarios, it might be very beneficial to run PowerShell scripts automatically from Visual Studio after certain actions (such as Deploy or Build).

One problem I had was with the version of PowerShell called from Visual Studio 2010: 32-bit or 64-bit. If you need to use the SharePoint DLLs, you must run the 64-bit version of PowerShell.

It won't work using "%WINDIR%\system32\WindowsPowerShell\v1.0\powershell.exe": you'll get an error when loading the 64-bit DLLs, because the PowerShell console that opens is the 32-bit version. Visual Studio is a 32-bit process, so the file system redirector silently maps system32 to SysWOW64.

In order to call the 64-bit version of PowerShell, you must use this path: %WINDIR%\sysNative\WindowsPowerShell\v1.0\powershell.exe.
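For example, a Post-build event command along these lines (the script name and its parameters are hypothetical; the sysnative path and the Visual Studio macros are real):

```
%WINDIR%\sysnative\WindowsPowerShell\v1.0\powershell.exe -NoProfile -ExecutionPolicy Bypass -File "$(ProjectDir)PostBuild.ps1" -TargetDir "$(TargetDir)"
```

The `-File` and `-ExecutionPolicy Bypass` switches let the script run non-interactively regardless of the machine's execution policy, and `$(ProjectDir)`/`$(TargetDir)` are standard Visual Studio build macros.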

Tuesday, March 6, 2012

SharePoint 2010 Tip #8: Site Collection URL without "/sites"

It is common to have people asking why new Site Collections have to be in "/sites", or any other common location, and not in the root (http://hostname/<sitecollection>). Some people find it annoying, or not appropriate.

It doesn't have to be that way: we can have our http://hostname/<sitecollection>. And we don't need URL rewriting or any other complex solution; we just have to use the OOTB options correctly!

Let's take a quick dive into Central Administration, under Application Management > Manage web applications. Select a Web Application and click "Managed Paths" in the ribbon. The reason the only option when creating a new Site Collection is "/sites" is the configuration here.

The default configuration is as follows:
  • (root) - Explicit inclusion
  • sites - Wildcard inclusion
What does this mean? It means that we can have one Site Collection in the root (/) and multiple inside sites (/sites/*). That's exactly what is happening: when you create your root Site Collection, you put it in the root URL ("/"), and when creating a new Site Collection, you only have the /sites option.

The explicit inclusion type allows you to create a single Site Collection at the specified URL (/<url>), while the wildcard inclusion allows you to create multiple Site Collections inside the specified URL (/<url>/*), but not at the base URL itself.

The reason for this is simple: if you had a Site Collection at that level and multiple Site Collections beneath it, IIS/SharePoint would not know (in a simple way) where to route a request for a site in /<url>/*. Would it be a subsite of the root Site Collection, or a Site Collection within it? That's why something like making the (root) path a wildcard inclusion will not work if you plan to have a Site Collection in the root as well. If you don't (only Site Collections in /*), it would actually work well.

Then, going back to our problem: how do we create a specific location for our Site Collection (http://hostname/<sitecollection>)?

Just add a new managed path, enter the URL you want (<sitecollection>) and choose "Explicit inclusion" as the type. The next time you create a Site Collection, this path will be available to you!

Once you use it for a Site Collection, it will obviously no longer be available when creating further Site Collections, which means you must configure a managed path for each Site Collection you want to place at http://hostname/<sitecollection>. That actually makes total sense.
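The same configuration can also be scripted. A sketch with the SharePoint 2010 cmdlets (the web application URL, path name, owner and template below are hypothetical):

```powershell
# Sketch: create an explicit managed path and a Site Collection on it.
# The web application URL, path name, owner and template below are hypothetical.
New-SPManagedPath -RelativeURL "projects" -WebApplication "http://hostname" -Explicit

New-SPSite -Url "http://hostname/projects" `
           -OwnerAlias "domain\user" `
           -Template "STS#0" `
           -Name "Projects"
```

Without the -Explicit switch, New-SPManagedPath creates a wildcard inclusion instead, which is the /sites behaviour described above.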

When creating your managed paths, you should always click "Check URL" to make sure no subsite exists at that location. If one does, you will actually be unable to enter that subsite while the managed path is there, as SharePoint will always try to redirect you to the Site Collection using that path. If you haven't created the Site Collection, you will get an HTTP 404 Not Found; you will still not be redirected to the subsite. Deleting the managed path will get everything back to normal.

On the other hand, after creating a managed path, when you try to create a subsite at that location in the root Site Collection you will get the following error:
Error
"<url>" cannot be used as a site name. Site names cannot contain certain reserved words and cannot begin with an underscore. Please enter a different name.

Site name actually refers to the Site URL name, not the Site Title (one of those funny SharePoint error messages).

SharePoint checks whether the subsite URL collides with any of the managed paths and actually enforces that no subsite is created there.

Tuesday, February 21, 2012

Visual Studio Tip #1: 'View Designer' option disappeared

This happened to me while building a SharePoint workflow.

I created an initiation form with the same name as the workflow, which generates a compilation error: the workflow class name clashes with the code-behind class of the initiation form. Changing it solves the compilation issue, but you completely lose the ability to view your workflow in Designer mode.

This happens because when you add the initiation form, changes are made to the .csproj stating that the class SubType is ASPXCodeBehind. When you rename the initiation form, new entries are created, but the old one keeps SubType ASPXCodeBehind, instead of returning to the initial value from when you created the workflow.

Re-opening Visual Studio may or may not solve it.

You can go into the .csproj and make sure your workflow class's SubType is set to Component, not Code or ASPXCodeBehind.
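In the .csproj, the entries look something like this (the file names below are hypothetical); the point is to make sure the workflow class entry reads Component:

```xml
<!-- Sketch of the relevant .csproj entries; file names are hypothetical. -->
<Compile Include="MyWorkflow\MyWorkflow.cs">
  <SubType>Component</SubType>            <!-- the workflow class: must be Component -->
</Compile>
<Compile Include="InitiationForm.aspx.cs">
  <DependentUpon>InitiationForm.aspx</DependentUpon>
  <SubType>ASPXCodeBehind</SubType>       <!-- the code-behind keeps ASPXCodeBehind -->
</Compile>
```

After editing the file, reload the project in Visual Studio for the change to take effect.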

Thursday, February 2, 2012

Thoughts on the Microsoft Approach


To me, Microsoft products have always had some common ground and a guiding light. Microsoft's motto over all these years has been about having products with a very user-friendly interface, products that provide very quick and direct advantages with little or no development time.

So... why aren't all products like that? Well, nothing is perfect, and sometimes having things that simple has disadvantages. Working hard to solve the top 80% of scenarios will usually leave the other 20% in bad shape.

What does that mean in practice? Basically, you can give your customers great value for money if their scenario fits the one Microsoft predicted; if the configuration options allow you to change that simple thing they asked you to change. Otherwise, you may be in trouble. Changing that small thing may sometimes be close to impossible, and you may need to do your own custom development from scratch, despite having something so close OOTB.

So, in relation to SharePoint, how does that affect a Developer or Consultant?

In my view, a SharePoint Developer should not be a standard Developer. For that matter, the same principle applies to almost all the platforms I have worked with.
A SharePoint Developer shouldn't focus only on the code itself or on the best way to deploy SharePoint artifacts. Their job is to know the platform very well and, first of all, to know all of the platform's abilities and OOTB components.

One would say: if Microsoft is pushing SharePoint in so many business areas and to serve so many purposes, isn't it impossible to know everything about the platform?

In my opinion, yes. To become a de facto expert in some of the areas SharePoint covers, you will need several years of experience, and even then you may still learn something new every day. That's just the way it is. But the goal here is to understand where your efforts should be placed, and in my opinion you should strive to understand the full SharePoint experience.

If you are just developing custom Web Parts, you will also gain a lot from understanding how the OOTB SharePoint Web Parts behave and how you can leverage them. You shouldn't have to rely on the Functional Consultant to tell you what's there OOTB to solve the problem you have at hand. You will provide more and better value by knowing it yourself and being able to take advantage of everything that's already there.

This may seem intuitive, but it is hardly ever applied in the real world. The example I gave here is about a SharePoint Developer knowing the OOTB platform, but it also applies within each role's responsibilities. For instance, SharePoint Functional Consultants who have been largely involved in Collaboration projects will gain huge value from understanding what's there in terms of Web Content Management and Enterprise Content Management. Because when you deploy a SharePoint farm with a Standard or Enterprise licence, you are offering your customer a very wide range of functionality that can be used for multiple purposes.

The best example I have of this is the Publishing feature. In all the Collaboration projects I have worked on so far with a Standard/Enterprise licence, I have recommended having that feature enabled. The functionality it provides OOTB is easy to understand and, even if Pages are not the core of the project, this feature provides a number of advantages that will always be very useful.

Now, going back to the SharePoint Developer: if he focuses more on the platform and less on code, what will he lose?

He will obviously be a less experienced coder than someone who develops 24/7 using C#, Java or Ruby on Rails. Maybe his knowledge of patterns and best practices won't be as deep as those guys'. He should still invest in that, but not as a "full time" thing. Balancing both worlds is the best approach, at least if he is committed to developing for platforms. There is a high number of SharePoint projects that involve zero coding, and a number of them where you'll end up with just 3 or 4 pages of code.

Knowing where to place yourself to maximize the value you provide is key.

Wednesday, February 1, 2012

SharePoint 2010 Tip #7: Creating Site Collection in new Content Database

Another quick but efficient tip, this time for SharePoint 2010: creating a Site Collection in a different or new Content Database.

One would have guessed this would be very simple. Maybe you'd have a choice in the Site Collection creation interface where you could pick which Content Database to use, or even create a new one. But no, there is no option whatsoever regarding Content Databases in the Site Collection creation interface.

Well, although it's not that intuitive, it is easy. Here's what you need to do.

First, you need to create the new Content Database. You can do it in the Central Administration interface built for that purpose; very simple, no caveats there.

After the target Content Database is created, and before you create your Site Collection, you need to set the status of all the other Content Databases in that Web Application to Offline, leaving your new Content Database as the only one with status Ready. Don't worry: an Offline Content Database keeps serving its existing Site Collections; the status only prevents new Site Collections from being added to it.

Now, just create the Site Collection; it will use your Content Database, as it is the only one with Ready status.

Don't forget to set all the other Content Databases' status back to Ready at the end.
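If you prefer not to toggle database statuses at all, PowerShell offers a cleaner route: New-SPSite accepts a -ContentDatabase parameter that places the new Site Collection directly in the database you name. A sketch (the URLs, database name, owner and template below are hypothetical):

```powershell
# Sketch: create a Content Database and place a new Site Collection directly in it.
# The URL, database name, owner and template below are hypothetical.
New-SPContentDatabase -Name "WSS_Content_Projects" -WebApplication "http://hostname"

New-SPSite -Url "http://hostname/sites/projects" `
           -OwnerAlias "domain\user" `
           -Template "STS#0" `
           -ContentDatabase "WSS_Content_Projects"
```

This avoids the Offline/Ready dance entirely and is easy to repeat for each Site Collection that needs its own database.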