22/08/2014

SharePoint Solution Setup Package MSI

After a few more hours of head banging against VS2010 setup projects I decided there had to be an easier way.
It doesn't help that custom actions can only be DLL, EXE or VBS files, or that the post-build command interface is a bit lousy - when it works!

I was toying with the idea of running a VBS to call PowerShell to install and uninstall a WSP file, but we are dealing with 64-bit, and VBScript calls to ps1 files don't play nicely with 64-bit!

So I did a little hunting around for a solution and found the wonderful PS2EXE on CodePlex.

The plan was to convert my ps1 files to exes and then have my MSI project call them as exes...straightforward, you would think...perhaps.

So here's how I progressed:

1. Make sure you put full paths into your ps1 files - you can't use relative paths, as the installer runs in the system32 context (see the sketch after these steps).


2. Convert your ps1 files to exes using PS2EXE.

3. Add your files into your VS setup project


4. Add the exe files created by PS2EXE into the Custom Actions section.


5. Make sure you set the InstallerClass property to false for your custom action files.


6. Ensure your target platform is set to 64-bit.
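
For reference, here's a minimal sketch of what the install ps1 might look like before conversion. The paths, solution name and PS2EXE switches below are hypothetical - adjust them for your own farm and your version of PS2EXE:

# Install-Wsp.ps1 - a sketch; convert it with something like:
#   .\ps2exe.ps1 -inputFile .\Install-Wsp.ps1 -outputFile .\Install-Wsp.exe -x64
# (switch names vary between PS2EXE versions)
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Full paths only - the installer runs in the system32 context,
# so relative paths will not resolve where you expect
$wspPath = "C:\Deploy\MySolution.wsp"

Add-SPSolution -LiteralPath $wspPath
Install-SPSolution -Identity "MySolution.wsp" -GACDeployment -AllWebApplications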



So when you run your setup.exe, the final step of the install will execute your PowerShell files that have been converted to exes.
Although this is probably not the best approach for every PowerShell file, it's perfect for solutions (WSPs).

Hope this saves you some MSI frustration!




12/06/2014

SshhhharePoint

Many of my engagements with customers and indeed other consultancies invariably have an 'intranet' element within the requirements.

I remember a while back when SP2010 was released and many of us realised that demonstrating the OOB user interface was not the best approach; the customer's response was invariably muted and unimpressed.

So I have been considering how we approach SharePoint engagements and indeed how we could approach them with a more practical business approach rather than a technical one.

Many of us are 'techies' - SharePoint solutions architects, SharePoint solutions developers and so on.
Many of us are also heavily engaged in SharePoint presales and SharePoint customer demos.

One thing we all have in common to some degree is that we are proud to be 'SharePoint' people....we love the product, love the architecture, love its nuances and all of the rich business functionality that it provides.

So it comes as no surprise that when we engage with customers we engage with them with our SharePoint hat on.
We are going to solve our customers' business problems, inefficiencies and overheads with a SharePoint solution....and we tell them that.

But should we take our SharePoint hat off?

The word 'SharePoint' has many connotations depending on who you engage with. In some information management communities it is treated like the Black Death; in others it is the fix for world hunger, world peace...you get the idea!

When users fill in their timesheets they don't think 'my timesheet has been created by system X'.
In the same way, when users interact with their intranet or information management system they don't consider what system it has been built on.

What is important to the user is that it makes their lives easier, enables them to work more efficiently and to be more productive during their working day.

What is important to the business is more efficient users = more efficient business = cost reduction through efficiency.

So what is the best approach with a SharePoint solution for the client?

We should be engaging with the client without SharePoint specifics...rather with solution specifics.

Should the user care that it's SharePoint...absolutely not.
Should the user know that it is SharePoint...it shouldn't matter what it is.

We develop our thoughts around a solution based upon SharePoint and that's a given as we are going to use SharePoint as our platform of choice.

But we often communicate the solution in terms of SharePoint - is that really the best way to present it?

Perhaps we should present the solution as a solution...I know that sounds obvious, but in reality many of us stick our SharePoint hat on and start talking SharePoint solutions; SharePoint specifics.

Obviously this may depend on the level of engagement, but how often does a records manager, a CEO or an information worker have to know the underlying technology?

Sure, the IT team will have a vested interest, but engaging with end users and information workers shouldn't require a SharePoint hat; in fact it may just muddy the waters.

Is that really SharePoint?
Does it really matter?









24/10/2013

Defrag SharePoint DB Indexes

Been doing some research into the fragmentation levels of SharePoint content DBs in order to maximise performance.

I ran some simple commands to find out the level of fragmentation...

DBCC SHOWCONTIG - this will show you the level of fragmentation in your DBs:



DBCC SHOWCONTIG scanning 'WorkflowAssociation' table...
Table: 'WorkflowAssociation' (1490104349); index ID: 1, database ID: 7
TABLE level scan performed.
- Pages Scanned................................: 5
- Extents Scanned..............................: 5
- Extent Switches..............................: 4
- Avg. Pages per Extent........................: 1.0
- Scan Density [Best Count:Actual Count].......: 20.00% [1:5]
- Logical Scan Fragmentation ..................: 80.00%
- Extent Scan Fragmentation ...................: 80.00%
- Avg. Bytes Free per Page.....................: 2001.2
- Avg. Page Density (full).....................: 75.28%
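
If you'd rather run the check across a whole content DB from PowerShell, here's a minimal sketch. It assumes the SQL Server PowerShell module (Invoke-Sqlcmd) is available, and the instance and database names are hypothetical:

# Report the most fragmented indexes in a content DB, worst first
Import-Module SQLPS -DisableNameChecking

Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database "WSS_Content" `
    -Query "DBCC SHOWCONTIG WITH TABLERESULTS, ALL_INDEXES, NO_INFOMSGS" |
    Sort-Object LogicalFragmentation -Descending |
    Select-Object ObjectName, IndexName, LogicalFragmentation -First 20 |
    Format-Table -AutoSize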


The following SQL will let you set a fragmentation threshold and defrag every index that exceeds it.

/* Perform a 'USE <database name>' to select the database in which to run the script. */
-- Declare variables
SET NOCOUNT ON;
DECLARE @tablename varchar(255);
DECLARE @execstr varchar(400);
DECLARE @objectid int;
DECLARE @indexid int;
DECLARE @frag decimal;
DECLARE @maxfrag decimal;

-- Decide on the maximum fragmentation to allow for.
SELECT @maxfrag = 30.0;

-- Declare a cursor.
DECLARE tables CURSOR FOR
SELECT TABLE_SCHEMA + '.' + TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';

-- Create the table.
CREATE TABLE #fraglist (
ObjectName char(255),
ObjectId int,
IndexName char(255),
IndexId int,
Lvl int,
CountPages int,
CountRows int,
MinRecSize int,
MaxRecSize int,
AvgRecSize int,
ForRecCount int,
Extents int,
ExtentSwitches int,
AvgFreeBytes int,
AvgPageDensity int,
ScanDensity decimal,
BestCount int,
ActualCount int,
LogicalFrag decimal,
ExtentFrag decimal);

-- Open the cursor.
OPEN tables;

-- Loop through all the tables in the database.
FETCH NEXT
FROM tables
INTO @tablename;

WHILE @@FETCH_STATUS = 0
BEGIN
-- Do the showcontig of all indexes of the table
INSERT INTO #fraglist
EXEC ('DBCC SHOWCONTIG (''' + @tablename + ''')
WITH FAST, TABLERESULTS, ALL_INDEXES, NO_INFOMSGS');
FETCH NEXT
FROM tables
INTO @tablename;
END;

-- Close and deallocate the cursor.
CLOSE tables;
DEALLOCATE tables;

-- Declare the cursor for the list of indexes to be defragged.
DECLARE indexes CURSOR FOR
SELECT ObjectName, ObjectId, IndexId, LogicalFrag
FROM #fraglist
WHERE LogicalFrag >= @maxfrag
AND INDEXPROPERTY (ObjectId, IndexName, 'IndexDepth') > 0;

-- Open the cursor.
OPEN indexes;

-- Loop through the indexes.
FETCH NEXT
FROM indexes
INTO @tablename, @objectid, @indexid, @frag;

WHILE @@FETCH_STATUS = 0
BEGIN
PRINT 'Executing DBCC INDEXDEFRAG (0, ' + RTRIM(@tablename) + ',
' + RTRIM(@indexid) + ') - fragmentation currently '
+ RTRIM(CONVERT(varchar(15),@frag)) + '%';
SELECT @execstr = 'DBCC INDEXDEFRAG (0, ' + RTRIM(@objectid) + ',
' + RTRIM(@indexid) + ')';
EXEC (@execstr);

FETCH NEXT
FROM indexes
INTO @tablename, @objectid, @indexid, @frag;
END;

-- Close and deallocate the cursor.
CLOSE indexes;
DEALLOCATE indexes;

-- Delete the temporary table.
DROP TABLE #fraglist;
GO

02/05/2013

SharePoint Records Repository & In Place Records Thoughts Part 2

Following on from Part 1

Let's consider a simple example of in place vs central records when it comes to aggregated content.

So imagine we have an image that will constitute a record at some point within its lifecycle.
The sensible thing to do is create a content type for the image and define an appropriate information management policy etc.

So we are ready for the image to be published and viewed within our organisation. We may go through some approvals or workflows and finally, when all the business processes have been met, we declare it as a record...fine, that's the correct thing to do.

So we have a library containing our image content types and we use an image from the library in some content within our organisation - in this simple example we are using it within a publishing page layout.

So here's our image in all its glory.


And here is our image content type in its document library.


So we now want to declare this as a record...fine.

In a central repository model we simply use Send To > Records Center and we are left with a nice link.
Great, it's now a record in our central repository.


And back to the home page.
Oh!!!!! Our image is now broken - the file itself has been moved away to the Records Center, leaving only the link behind.

If we are using in place records management we can simply declare the record in situ.



And our image will not be removed.
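
Incidentally, the in place declaration can be scripted through the server object model too. A minimal sketch - the URL, library and file names are hypothetical, and it assumes the In Place Records Management feature is active:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
# DeclareItemAsRecord lives in the Microsoft.Office.Policy assembly
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Policy")

$web  = Get-SPWeb "http://intranet/sites/publishing"
$item = $web.Lists["Images"].Items | Where-Object { $_.Name -eq "logo.jpg" }

# Declare the record in situ - the file stays put, so pages that
# reference it carry on working
[Microsoft.Office.RecordsManagement.RecordsRepository.Records]::DeclareItemAsRecord($item)

$web.Dispose()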

Although this is a simple example it is an effective one.
Any content within your organisation is potentially a record, be it images, documents and so on.
As it is business content, it will probably come under the remit of some records management policy.

So when planning your records management architecture it is worth bearing in mind how you will aggregate any records-based content to other/all areas of your farm.

In place or central repository?
Plan, plan, plan!!!!!








SharePoint Event Receivers & Expiration Policy

More event receiver fun this week.

I've been writing some synchronous event receivers this week, one of which has been causing a headache, namely ItemDeleting.

The event receiver acts correctly when users manually delete an item...great.
The event receiver also acts correctly when a workflow deletes an item...great.

If an item has an information management policy attached to it ('permanently delete' or 'move to recycle bin'), the Expiration Policy timer job will kick in and delete the item.



BUT the event receiver will not be fired when the timer job executes!!!!!!

Cause: when the Expiration Policy timer job deletes an item it actually does an SPItem.SystemUpdate() rather than an SPItem.Update().
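
If you want to reproduce this without waiting for the schedule, you can kick the timer job off by hand. A minimal sketch - the display name match and web application URL are assumptions, so check Get-SPTimerJob output on your own farm:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find and start the Expiration Policy job for a web application
Get-SPTimerJob -WebApplication "http://intranet" |
    Where-Object { $_.DisplayName -like "*Expiration*" } |
    Start-SPTimerJob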


Worth bearing in mind if you need to capture an event driven by the Expiration Policy timer job.

23/04/2013

SharePoint Records Repository & In Place Records Thoughts Part 1

Something I've been thinking about for a while now, and I finally decided to get something on record (see what I did there).

Records management in SharePoint is an emotive subject and I'm not going to cover the rights and wrongs of RM implementation.

I have had many discussions on the implementation of SharePoint records management with regards to in place records and a central repository approach. While I agree each has its merits, one thing many people don't consider is what actually constitutes a record, and how we manage a record once we realise it's not just about documents.

So firstly, let's identify what a record actually is...

The records management ISO standard (ISO 15489) defines records as:
Information created, received, and maintained as evidence and information by an organization or person, in pursuance of legal obligations or in the transaction of business.

The ICA defines a record as:
recorded information produced or received in the initiation, conduct or completion of an institutional or individual activity and that comprises content, context and structure sufficient to provide evidence of the activity.

In both of the above definitions a record is defined as a piece of information, but for some reason we all think of a record as a document or group of documents: Word, PDF, CAD, Excel, etc.

This is where SharePoint content types really come to the fore. As we all know, a content type can represent any piece of information within a farm.

So it's easy...let's create a content type for each piece of information (impractical but possible).
Let's take a scenario.
Great Corp have an intranet, and like most intranets it has news on the homepage.

Now this news may be an announcement content type or a custom content type; very rarely is it a document, but it is still a record.

But hey, that's fine, we will create a Great Corp News content type and use that as our news information container.

We'll even put our records manager's hat on and give it an information policy, auditing, expiration and a workflow.

See it's dead easy given the wonderful functionality that SharePoint gives us.
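
For what it's worth, creating such a content type can be scripted via the server object model. A minimal sketch with a hypothetical site URL - note it derives from Item, not Document, which matters later:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://greatcorp/sites/intranet"

# Derive 'Great Corp News' from the Item content type (not Document)
$parent = $web.AvailableContentTypes["Item"]
$ct = New-Object Microsoft.SharePoint.SPContentType -ArgumentList $parent, $web.ContentTypes, "Great Corp News"
$web.ContentTypes.Add($ct) | Out-Null
$web.Update()
$web.Dispose()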


So now we can add the news, as a content type, to our homepage.



OK, so now let's imagine that this piece of news is important, in fact it's vital and can be classed as a vital record.

It may be a safety-related piece of news or something similar which has to be managed following our formal records management process.

So let's declare it as a record; it's the sensible thing to do.

Scenario 1 - In Place Records

We can declare the news item (vital record) as a record using in place records management.


The vital record news item is now locked and cannot be edited or changed, which is absolutely correct.

Scenario 2 - Central Records Repository

OK, so let's declare this news item as a record inside our central repository.

Oh, wait a minute, we can only send document-based records to our central repository.

OK, it's not the end of the world; we can use our content type's information management policy, a workflow or similar to move it to the Records Centre.



We will have to move it and leave a link. If we copy it we will have two identical records in two different places in our farm...not good!

Gotcha...although the information management policy will let us set this rule, you cannot transfer a non-document-derived content type to the Records Centre!!!!
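
The object model spells the limitation out: the send-to-repository call (SPFile.SendToOfficialFile) hangs off SPFile, so a list item with no underlying file has nothing to call. A minimal sketch with hypothetical site and list names:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web  = Get-SPWeb "http://greatcorp/sites/intranet"
$item = $web.Lists["Company News"].Items[0]

if ($item.File -eq $null) {
    # Announcements and other list items have no SPFile behind them,
    # so there is no supported call to route them to the repository
    Write-Host "'$($item.Title)' is list based - it cannot be sent to the Records Centre"
}

$web.Dispose()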


In conclusion

I understand the reasons that many organisations want to use a central repository for all of their formal records. However, you should consider the following:

  • Much of your content within SharePoint will be list-based and not necessarily document-based.
  • Your information architecture policy should reflect this; identify all the information that will constitute a record, not just the documents.
  • Declaration of records should take into consideration the limitations of a central repository without code and add-ons.

In part 2 I'll look at the technical issues that have to be overcome for central and in place records management....

01/11/2012

WSS Emailing Itself - x-mailer: Windows SharePoint Services (version 3)

I had a case recently where a client wanted to send access request emails to an email-enabled document library. We encountered the problem of WSS not being able to handle its own x-mailer header.
A solution was to create a VBScript that fired when an email entered the 'Drop' folder of the IIS mailroot and stripped out the problematic header.
Initial testing proved successful and this has now been implemented.

Solution:
1. Copy smtpreg.vbs into c:\inetpub\AdminScripts - smtpreg.vbs is available to download from Microsoft.

2. Create a vbs file from the script below and copy it to the AdminScripts folder (I named mine wss.vbs).

<SCRIPT LANGUAGE="VBScript">

' CDO OnArrival event sink - runs for every message that lands in the
' IIS SMTP Drop folder and strips out the WSS x-mailer header
Sub ISMTPOnArrival_OnArrival(ByVal iMsg, EventStatus)

    If iMsg.Fields("urn:schemas:mailheader:x-mailer") = "Windows SharePoint Services (version 3)" Then
        iMsg.Fields.Delete("urn:schemas:mailheader:x-mailer")
        iMsg.Fields.Update
    End If

    ' Persist the change and hand the message on to the next sink
    iMsg.DataSource.Save
    EventStatus = 0
End Sub

</SCRIPT>




3. Run the following commands to register the script:

cd c:\inetpub\adminscripts
cscript smtpreg.vbs /add 1 OnArrival DeleteMsg CDO.SS_SMTPOnArrivalSink "mail from=*"
cscript smtpreg.vbs /setprop 1 OnArrival DeleteMsg Sink ScriptName "c:\Inetpub\AdminScripts\wss.vbs"

4. This will now catch all mail on arrival and remove the offending header.
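
To check the registration took, or to back the sink out later, smtpreg.vbs also supports enumerate and remove switches (quoted from memory, so verify against your copy of the script):

cscript smtpreg.vbs /enum
cscript smtpreg.vbs /remove 1 OnArrival DeleteMsg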