Enable users to create sub-keys in the registry

Users cannot create new registry keys unless they are local administrators. This is a problem in case you have features which need to create their own application log categories or similar: an error is thrown and the feature is not activated.

This problem can be fixed by editing the registry.

  1. Log on to the computer as an administrator.
  2. Click Start, click Run, type regedit in the Open box, and then click OK. The Registry Editor window appears.
  3. Locate the following registry subkey:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Eventlog

  4. Right-click Eventlog, and then click Permissions. The Permissions for Eventlog dialog box appears.
  5. Click Advanced. The Advanced Security Settings for Eventlog dialog box appears.
  6. In the Name column, double-click the Users group. The Permission Entry for Eventlog dialog box appears.
  7. Select the Set Value check box, select the Create Subkey check box, and then click OK.
  8. Quit Registry Editor.
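
If you need to apply this change to several servers, the same permission change can be scripted. Below is a minimal PowerShell sketch, assuming the built-in Users group and the Eventlog key from the steps above; run it from an elevated prompt:

# Grant the local Users group Set Value and Create Subkey on the Eventlog key.
$key = "HKLM:\SYSTEM\CurrentControlSet\Services\Eventlog"
$acl = Get-Acl $key
$rights = [System.Security.AccessControl.RegistryRights]"SetValue, CreateSubKey"
$rule = New-Object System.Security.AccessControl.RegistryAccessRule("BUILTIN\Users", $rights, "ContainerInherit", "None", "Allow")
$acl.AddAccessRule($rule)
Set-Acl -Path $key -AclObject $acl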

iFilter for PDF files in the SharePoint 2010 Crawl

In SharePoint 2010 we can use iFilters to extend the functionality of the search engine. In this post I will talk about iFilters, and more specifically about how you can ensure that your PDF files are crawled by the SharePoint search. You can read more about iFilters at http://technet.microsoft.com/en-us/library/gg405170.aspx

The first step is to add an icon for PDF files. This step is optional; skip it if you do not wish to add the icon to your SharePoint environment.

Installing the PDF icon

  • First you need to download the PDF icon. You can find this at http://www.adobe.com/misc/linking.html#pdficon
  • We then need to add this icon to SharePoint.
  • Find the file DOCICON.XML in your 14-hive folder (14\TEMPLATE\XML\)
  • Search for the following line: <Mapping Key="pdf"
  • If this line exists you already have the icon and can move to the next step; if it doesn't exist you should add the following line inside the <ByExtension> tag: <Mapping Key="pdf" Value="pdficon_small.png" /> (the resulting fragment is shown after this list). The value here is simply the file name of the PDF icon (the standard name is pdficon_small.png); you can change this if needed.
  • Now we have told SharePoint to look for the image pdficon_small.png when it finds a PDF document, so the last thing we need to do is to actually add the image somewhere SharePoint can find it.
  • Open \14\TEMPLATE\IMAGES\ and simply add pdficon_small.png to that folder.
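
For reference, after the edit the relevant part of DOCICON.XML should look something like this (the other mappings are left out here):

<ByExtension>
  <!-- ...existing mappings... -->
  <Mapping Key="pdf" Value="pdficon_small.png" />
  <!-- ...existing mappings... -->
</ByExtension>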

Installing the iFilter

Now that we have the icon for PDF files set up, we need to install the actual iFilter which our crawl will use. Download the 64-bit PDF iFilter from Adobe and run the installer on your index server.

The iFilter is now installed on the server, but we still need to tell SharePoint to use it.

  • Open Central Administration and navigate to the Search Service Application
  • From the left-hand menu select “File Types”
  • Click on “New File Type”
  • Enter “pdf” as the extension and click OK. (The same step can be scripted; see the sketch after this list.)
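
If you prefer PowerShell over Central Administration, here is a minimal sketch of the same step, assuming a single search service application in the farm:

# Add "pdf" as a crawled file type on the search service application.
$ssa = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchCrawlExtension -SearchApplication $ssa -Name "pdf"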

Now we need to perform an IIS reset in order for the changes to take effect. (Warn your users before you do this, since their sessions will be terminated.)

  • Start the command prompt: [Start] -> [All programs] -> Accessories -> Command Prompt
  • Type iisreset then press enter
  • Type NET STOP OSearch14 then press enter
  • Type NET START OSearch14 then press enter

You can now crawl your PDF files (start a full crawl).

Note:

It is worth mentioning that there are commercial iFilters as well that will crawl your files much faster. The free iFilter from Adobe will only crawl one PDF at a time, so if the time it takes to crawl your farm is a problem due to a large number of PDF files, you might want to look into the commercial PDF iFilters.

Load Control Template file (TaxonomyPicker.ascx) failed – ERROR

If you get the following error:

Load control template file /_controltemplates/TaxonomyPicker.ascx failed: Could not load type 'Microsoft.SharePoint.Portal.WebControls.TaxonomyPicker' from assembly 'Microsoft.SharePoint.Portal, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c'.

You need to check that the TaxonomyPicker.ascx file has not been modified:

  1. Open your 14-hive folder (C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14)
  2. Open the TEMPLATE folder
  3. Open the CONTROLTEMPLATES folder
  4. Find the TaxonomyPicker.ascx file and open it in Notepad
  5. On the first line, check whether you have &#44 instead of ,

<%@ Control className="TaxonomyPickerControl" Language="C#" Inherits="Microsoft.SharePoint.Portal.WebControls.TaxonomyPicker&#44Microsoft.SharePoint.Portal, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

Replace '&#44' with ','. The correct line should look like this:

<%@ Control className="TaxonomyPickerControl" Language="C#" Inherits="Microsoft.SharePoint.Portal.WebControls.TaxonomyPicker, Microsoft.SharePoint.Portal, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

Save the file; the error should now be gone.
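
If you have several servers in the farm and would rather script the fix, here is a minimal PowerShell sketch, assuming the default 14-hive path from the steps above; it backs the file up before replacing the entity:

# Path to the control template in the default 14-hive location.
$path = "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\CONTROLTEMPLATES\TaxonomyPicker.ascx"
# Keep a backup before touching the file.
Copy-Item $path "$path.bak"
# Replace the HTML-encoded comma with a real comma.
(Get-Content $path) -replace '&#44', ',' | Set-Content $path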

Database Info

This is just a list of some of the databases used in most SharePoint 2010 farms.

Configuration Database (SharePoint_Config)

The configuration database is a small database (1GB or less); the transaction logs are however likely to grow quickly due to the amount of changes that take place. It is read frequently but written to less often. The database contains data about the SharePoint farm; it is what the entire farm relies on for general settings relating to databases, IIS web applications, site templates, web applications, default quotas etc.

Size information: Small (1GB). The transaction logs are however likely to grow.
Read/Write: Read-heavy
Scaling: Scale up. Only one configuration database per farm is allowed.
Default recovery model: Full. It's recommended to set the configuration database to simple recovery to restrict the growth of the log file.

Central Administration Content (Central_Admin_Content)

The Central Administration content database is a small database (1GB or less); if you use PowerPivot it will grow larger. It stores information about all site content, meaning all documents and files, list data and web part properties. It also hosts all information about user accounts, service accounts etc. It has varying read/write characteristics, meaning that we cannot determine whether it is more read- or write-intensive.

Size information: Small (1GB or less). If PowerPivot is used it will grow larger.
Read/Write: Varies
Scaling: Scale up. Only one Central Administration content database is allowed per farm.
Default recovery model: Full

Content Databases (WSS_Content)

It is strongly recommended to limit the size of the content databases to 200GB to ensure system performance. The read/write characteristics vary depending on the content hosted in the database: in a collaboration site it will be more write-intensive, and in a document management environment it will be more read-intensive. In certain cases the content databases can be up to 1TB, but only if the site is a repository that archives relatively static data (document centers etc).

Size information: Recommended maximum 200GB
Read/Write: Varies
Scaling: Any content database which is hosting a site collection must be scaled up, as we cannot split a site collection across two databases. It is however recommended to create new site collections in the same web application with their own databases, as the recommended maximum size of a database is 200GB. If a content database is hosting multiple site collections it is recommended to move the site collections to their own databases (see the sketch after this table).
Default recovery model: Full
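
As an illustration of that scale-out guidance, here is a hedged PowerShell sketch using the SharePoint 2010 cmdlets; the database name, URLs and owner are placeholders:

# Create a dedicated content database in an existing web application.
New-SPContentDatabase -Name "WSS_Content_Projects" -WebApplication "http://intranet"
# Create a new site collection directly in that database.
New-SPSite -Url "http://intranet/sites/projects" -ContentDatabase "WSS_Content_Projects" -OwnerAlias "DOMAIN\user" -Template "STS#0"
# Or move an existing site collection into its own database.
Move-SPSite "http://intranet/sites/oldsite" -DestinationDatabase "WSS_Content_Projects"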

Usage and Health Data Collection database (WSS_UsageApplication)

The usage and health database is used to store health monitoring and usage data temporarily, and is also used for reporting and diagnostics.

Location requirements: This is a very active database that should be put on a separate disk if possible.
Size information: Extra-large (1TB or more). The size depends on the retention time and the number of objects being monitored.
Read/Write: Very write-heavy
Scaling: The usage and health database must scale up, as only one logging database is allowed per farm.
Default recovery model: Simple

Business Data Connectivity database (BDC_Service_DB)

The BDC database stores external content types and related objects.

Size information: Small (1GB or less). The size depends on the number of connections.
Read/Write: Very read-heavy
Scaling: Must scale up; only one BDC database is allowed per farm.
Default recovery model: Full

State Database (StateService)

The state database stores temporary state information for InfoPath, chart web parts and the Visio service.

Size information: Medium – Large (100GB – 1TB). The size is determined by the use of InfoPath and Visio.
Read/Write: Varies
Scaling: Scale out. Add additional state databases using PowerShell (see the sketch below).
Default recovery model: Full
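
A minimal sketch of that PowerShell step; the database name is a placeholder and a single state service application is assumed:

# Add a second state database to spread the state service load.
$stateApp = Get-SPStateServiceApplication
New-SPStateServiceDatabase -Name "StateService_DB2" -ServiceApplication $stateApp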

Web Analytics Staging database

The staging database temporarily stores un-aggregated fact data, asset metadata and queued batch data for the Web Analytics service application.

Size information: Medium (100GB). The size is determined by the number of reports being generated.
Read/Write: Varies
Scaling: Scale out. Add additional staging databases with the service application instance.
Default recovery model: Full

Web Analytics Reporting database

The reporting database stores the aggregated report tables, fact data aggregated by groups of sites, date and asset metadata, and diagnostics information for the Web Analytics service.

Size information: Extra-large (1TB or more). The size is determined by the retention policy of the data.
Read/Write: Varies
Scaling: Scale out. Add additional reporting databases with the service application instance.
Default recovery model: Full

Search Service Application Administration Database

This database hosts the search service application configuration, the access control list (ACL) and the “best bets” for the crawl component. It is accessed for every user and administrative action.

Location requirements: The administration database should fit into the RAM on the server so that it can handle the end-user query load as efficiently as possible. The administration and crawl databases should not be located on the same server.
Size information: Small – Medium (1GB – 100GB). The size is determined by the number of best bets, the number of content sources and crawl rules, the security descriptors for the corpus and the amount of traffic.
Read/Write: Equal
Scaling: The database must be scaled up. Additional instances can be created, but each instance can only host one database.
Default recovery model: Simple

Search Service Application Crawl Database

The crawl database stores the state of the crawled data and the crawl history. In large-scale environments it is recommended to run the crawl database on a server that uses SQL Server 2008 Enterprise Edition so that data compression can be used.

Location requirements: The crawl database is very I/O-intensive and causes the SQL cache to be flushed regularly. It should not be hosted on the same server as the databases involved in end-user tasks (property database, content databases etc).
Size information: Medium – Large (100GB – 1TB). The number of items in the corpus determines the size of the database.
Read/Write: Read-heavy; the ratio is roughly 3 reads to 1 write.
Scaling: Scale out. Associate another crawl database with the service application instance (see the sketch below). Multiple crawl databases can be placed on the same server as long as it can handle the required I/O per second.
Default recovery model: Simple
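
A minimal PowerShell sketch of that scale-out step, assuming a single search service application; the database name is a placeholder:

# Associate an additional crawl database with the search service application.
$ssa = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchCrawlDatabase -SearchApplication $ssa -DatabaseName "Search_CrawlStore2"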

Search Service Application Property Database

The property database stores information that is associated with the crawled data, including properties, history and crawl queues. In larger deployments it is recommended to use a SQL Server 2008 Enterprise Edition server so that data compression can be used.

Location requirements: At least one-third of the property database should fit into the RAM on the server. In large-scale deployments it is recommended that this database be hosted on its own server to achieve faster query results.
Size information: Large – Extra-large (1TB or more). The size is determined by the number of managed properties and the number of documents.
Read/Write: Write-heavy; the ratio is roughly 1 read to 2 writes.
Scaling: Scale out. Connect another property database with the service application (see the sketch below). Each additional property database should be hosted on a different server.
Default recovery model: Simple
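
The corresponding sketch for the property database (again, the database name is a placeholder):

# Connect an additional property database to the search service application.
$ssa = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchPropertyDatabase -SearchApplication $ssa -DatabaseName "Search_PropertyStore2"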

User Profile Service Application Profile Database

The profile database stores and manages all users and information associated with the users. It also stores information about the user’s social network and memberships for sites and lists.

Size information: Medium – Large (100GB – 1TB). The size is determined by the number of users, the use of news feeds and the retention time.
Read/Write: Read-heavy
Scaling: Scale up. Additional instances of the service application can be created, but generally for business reasons rather than size.
Default recovery model: Simple

User Profile Service Application Synchronization Database

The synchronization database stores the configuration and staging data for use when profile data is being synchronized with Active Directory.

Size information: Medium – Large (100GB – 1TB). Grows with more users and groups.
Read/Write: Equal
Scaling: Scale up. Additional instances of the service application can be created, but generally for business reasons rather than size.
Default recovery model: Simple

User Profile Service Application Social Tagging database

The social tagging database stores social tags and notes created by users, as well as their URLs.

Size information: Small – Large (1GB – 1TB). Grows as additional tags, ratings and notes are created and used.
Read/Write: Read-heavy; the ratio is roughly 50 reads to 1 write.
Scaling: Scale up. Additional instances of the service application can be created, but generally for business reasons rather than size.
Default recovery model: Simple

Managed Metadata database

The managed metadata database stores the managed metadata and the syndicated content types.

Size information: Medium (100GB). The amount of managed metadata determines the size.
Read/Write: Read-heavy; the ratio is roughly 1000 reads to 1 write.
Scaling: Scale up. Additional instances of the service application can be created, but generally for business reasons rather than size.
Default recovery model: Simple

Enable Developer Dashboard

In order to enable the Developer Dashboard you can use the following PowerShell script:


# Load the SharePoint assembly so the administration types are available.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

# Get the Developer Dashboard settings from the content service.
$devboard = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.DeveloperDashboardSettings

# Turn the dashboard on for every page.
$devboard.DisplayLevel = "On"

# Only show the dashboard to users with full permissions (administrators).
$devboard.RequiredPermissions = [Microsoft.SharePoint.SPBasePermissions]::FullMask

# Include trace output in the dashboard.
$devboard.TraceEnabled = $true

$devboard.Update()

You can change the DisplayLevel value depending on whether you want the dashboard to be always on, off, or on demand. If it is on demand, the user has to press the developer dashboard icon in the top right corner to enable the dashboard for that given page.


$devboard.DisplayLevel = "On"

$devboard.DisplayLevel = "Off"

$devboard.DisplayLevel = "OnDemand"

In addition to just enabling the developer dashboard, you can also customize what permission level a user must have to be able to use it. The value used here ensures that it will only be available to administrators:


$devboard.RequiredPermissions = [Microsoft.SharePoint.SPBasePermissions]::FullMask

So go out, use the developer dashboard and make sure you gain control of your environment 🙂

Indexing Gantt Views in SharePoint

When SharePoint tries to index pages with Gantt views you will get an error similar to:

Microsoft.SharePoint.SPException: This view requires at least Microsoft Internet Explorer 7.0, Mozilla FireFox 3.0, or Apple Safari 3.0.

This is a bug in SharePoint which makes it impossible for the index server to index the Gantt view, because the crawler's user agent is not recognized as a supported browser.

In order to fix this we need to modify the registry on the index server as follows:

  • Start Regedit
  • Locate the following key:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office Server\14.0\Search\Global\Gathering Manager\

  • In this key, locate the User Agent value
  • Change the value from:
    Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 6.0 Robot)

    To:

    Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.3; .NET4.0C; MS-RTC LM 8; Tablet PC 2.0)
  • Restart the server.

You should now be able to index these pages without any problems.
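
If you would rather script the change than use Regedit, here is a minimal PowerShell sketch of the same edit (same key and value as above):

# The gatherer settings key used by the SharePoint 2010 crawler.
$path = "HKLM:\SOFTWARE\Microsoft\Office Server\14.0\Search\Global\Gathering Manager"
# Overwrite the User Agent value with the supported browser string.
Set-ItemProperty -Path $path -Name "User Agent" -Value "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.3; .NET4.0C; MS-RTC LM 8; Tablet PC 2.0)"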

RPC Server Unavailable Error

If your event log shows errors relating to the RPC Server being unavailable (COM Exception 0x800706BA, Event ID 3), then you most likely need to start the Remote Procedure Call service. Start services.msc from the Start -> Run menu.

Locate the Remote Procedure Call (RPC) services and:

1. Make sure that they are both set to start automatically.

2. Make sure that they are running. (You could also try to restart the services if they are running and you still get the same error.)

Starting or restarting these should solve the problem.
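
If you prefer to check the services from PowerShell, here is a small sketch; RpcSs and RpcLocator are the Windows service names behind the Remote Procedure Call (RPC) entries in services.msc:

# Show state and start mode for the RPC services.
Get-WmiObject Win32_Service -Filter "Name='RpcSs' OR Name='RpcLocator'" |
    Select-Object Name, State, StartMode
# Ensure the locator starts automatically (RpcSs itself is a protected system service).
Set-Service RpcLocator -StartupType Automatic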

Could not find stored procedure ‘dbo.Search_GetRecentStats’

This error is generally due to the fact that the Search service application cannot write to the usage and health database.

Event ID: 6398

The Execute method of job definition Microsoft.Office.Server.Search.Monitoring.HealthStatUpdateJobDefinition (ID 9cb6be54-0384-4c6e-abfc-c2f25621a3ed) threw an exception. More information is included below.

Could not find stored procedure ‘dbo.Search_GetRecentStats’.

Event ID: 5586

Unknown SQL Exception 2812 occurred. Additional error information from SQL Server is included below.

Could not find stored procedure ‘dbo.Search_GetRecentStats’.

This is most likely because health data collection is not enabled for the usage and health service. Go to Central Administration and, under Monitoring, select Configure usage and health data collection. Select both Enable usage data collection and Enable health data collection.

Then perform an IISReset and restart the SPTimerV4 service.

(If these are already on, you need to disable them first then enable them again. Make sure to perform an IISReset and restart the SPTimerV4 service in between.)
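
For reference, the reset and the timer service restart can be run from an elevated command prompt (SPTimerV4 is the service name of the SharePoint 2010 Timer service):

iisreset
net stop SPTimerV4
net start SPTimerV4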

If you have problems disabling “Enable health data collection”, you will have to stop some timer jobs first:

Go to Monitoring and then Configure usage and health data collection. Under Enable health data collection, click Health logging schedule.

Disable the following timer jobs: Diagnostic Data Provider: Performance Counters - Database Servers, and User Profile Service - User Profile to SharePoint Quick Synchronization.

Then go to Monitoring and Review job definitions.

Disable the following timer job: Health Statistics Updating.

Now you should be able to disable “Enable health data collection”.