Monday, November 23, 2009

Longest Running Procedures in SQL Server

It's useful to determine which stored procedures or code have the greatest impact on the server. Sometimes that is determined by examining the I/O cost, sometimes by the execution duration. In this example, total impact is determined by examining both the length of execution and the frequency of execution.

SELECT TOP 10
    temp.text AS ProcedureName,
    s.execution_count AS ExecutionCount,
    ISNULL( s.total_elapsed_time / s.execution_count, 0 ) AS AvgExecutionTime,
    s.total_worker_time / s.execution_count AS AvgWorkerTime,
    s.total_worker_time AS TotalWorkerTime,
    s.max_logical_reads AS MaxLogicalReads,
    s.max_logical_writes AS MaxLogicalWrites,
    s.creation_time AS CreationDateTime,
    s.total_physical_reads AS PhysicalReads,
    -- NULLIF guards against divide-by-zero when the plan was cached this very second
    ISNULL( s.execution_count / NULLIF( DATEDIFF( second, s.creation_time, GETDATE() ), 0 ), 0 ) AS CallsPerSecond
FROM sys.dm_exec_query_stats s
CROSS APPLY sys.dm_exec_sql_text( s.sql_handle ) temp
ORDER BY
-- s.total_elapsed_time DESC
s.execution_count DESC

Friday, November 20, 2009

Transaction Log Growing in SQL Server

The transaction log is one of the most important parts of a SQL Server database, as well as one of the most common sources of problems I see online. The following are common causes of transaction log growth:
 
1. Uncommitted transactions. We can find whether there are any open transactions using DBCC OPENTRAN.
2. Running DBCC DBREINDEX or CREATE INDEX operations with the database in the Full recovery model.
3. Running extremely large transactions, such as BULK INSERT or SELECT INTO commands.
http://support.microsoft.com/kb/317375/
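
As a quick check for cause 1, DBCC OPENTRAN reports the oldest active transaction; the database name below is only an illustrative placeholder:

```sql
-- Report the oldest active transaction in the current database
DBCC OPENTRAN

-- Or target a specific database (the name here is an example)
DBCC OPENTRAN ('MyDatabase') WITH TABLERESULTS
```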
 
The following are proactive measures to minimize unexpected log file growth:
 
1. If you do not need point-in-time recovery of your databases, you can change the recovery model to Simple.
2. Set the size of the transaction log files to a large value to avoid frequent automatic expansion of the transaction log files.
3. Configure the automatic expansion of the transaction log in terms of MB instead of a percentage.
4. Back up the transaction log regularly to remove the inactive portion of the log if you are using the Full or Bulk-Logged recovery model.
5. You can switch from the Full to the Bulk-Logged recovery model when you perform bulk inserts, SELECT INTO, bcp, ALTER INDEX, or CREATE INDEX commands, because these operations are minimally logged under Bulk-Logged recovery; after those operations complete, you can switch back to the Full recovery model.
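
Measures 1 and 3 above can be applied with ALTER DATABASE; the database and log file names below are placeholders:

```sql
-- Example only: switch to the Simple recovery model (no point-in-time recovery)
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE;

-- Example only: grow the log in fixed 256 MB increments instead of a percentage
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_log, FILEGROWTH = 256MB);
```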
 

Solutions
  1. Take a full backup of your database.
  2. Take transaction log backups frequently, say every 15 or 30 minutes, so that the log file does not grow drastically.
  3. Shrink the log only if you do not have any free space. You can perform this operation manually if required.
  4. In general, avoid shrinking the database and keep it as a last option.
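
Steps 1 and 2 might look like the following; the database name and backup paths are examples:

```sql
-- One full backup as a baseline
BACKUP DATABASE MyDatabase TO DISK = 'D:\Backup\MyDatabase.bak';

-- Then frequent log backups (e.g. scheduled every 15-30 minutes via SQL Server Agent)
BACKUP LOG MyDatabase TO DISK = 'D:\Backup\MyDatabase_log.trn';
```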
If you are in the Full recovery model, then no, the log will not be truncated automatically. Run the following command to check:
 

SELECT DATABASEPROPERTYEX('Lorenzo347', 'RECOVERY')

If it returns FULL or BULK_LOGGED, then you will have to back up the log, either to disk or with the TRUNCATE_ONLY option, to get it to truncate the space. If you do that, then you might as well change your recovery model to SIMPLE and be done with this problem. If it says you are already in SIMPLE, then the log should shrink without problems.
 
backup log <your database name> with truncate_only
 
(Note that WITH TRUNCATE_ONLY was removed in SQL Server 2008; on later versions, switch to the Simple recovery model instead.)
 
Shrink Log files
 
DECLARE @LogFileName varchar(100)
SELECT @LogFileName = RTRIM(name)
FROM dbo.sysfiles
WHERE name LIKE '%_log%'
DBCC SHRINKFILE(@LogFileName, 2)

This shrinks the log file to 2 MB.


http://msdn.microsoft.com/en-us/library/aa258824(SQL.80).aspx
  
 Note:
Detaching the database and deleting the log is definitely not advisable.
 At best it forces you to take your database offline. Worst case, you invalidate your entire database and have to restore from backup.
 
 
 

Monday, November 16, 2009

Indexes to be Recreated in SQL Server

When your database grows, index fragmentation can become too high, which degrades SQL Server performance. To overcome that, the indexes need to be rebuilt, preferably during server downtime. Here is a query that lists all the indexes that should be reorganized or rebuilt. It uses the following views: sys.indexes, sys.tables, sys.schemas, sys.dm_db_index_physical_stats, and sys.partitions. For a more detailed check, change the last NULL in the dm_db_index_physical_stats call to 'SAMPLED' or even 'DETAILED'.


SELECT 'ALTER INDEX [' + ix.name + '] ON [' + s.name + '].[' + t.name + '] ' +
    CASE WHEN ps.avg_fragmentation_in_percent > 40 THEN 'REBUILD'
         ELSE 'REORGANIZE' END +
    CASE WHEN pc.partition_count > 1 THEN ' PARTITION = ' +
         CAST(ps.partition_number AS nvarchar(max)) ELSE '' END
FROM sys.indexes AS ix
INNER JOIN sys.tables t
    ON t.object_id = ix.object_id
INNER JOIN sys.schemas s
    ON t.schema_id = s.schema_id
INNER JOIN (SELECT object_id, index_id, avg_fragmentation_in_percent, partition_number
            FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, NULL)) ps
    ON t.object_id = ps.object_id AND ix.index_id = ps.index_id
INNER JOIN (SELECT object_id, index_id,
                   COUNT(DISTINCT partition_number) AS partition_count
            FROM sys.partitions
            GROUP BY object_id, index_id) pc
    ON t.object_id = pc.object_id AND ix.index_id = pc.index_id
WHERE ps.avg_fragmentation_in_percent > 10
  AND ix.name IS NOT NULL
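
The query emits one ALTER INDEX statement per fragmented index; the output looks roughly like this (table and index names are made up):

```sql
ALTER INDEX [IX_Orders_CustomerID] ON [dbo].[Orders] REBUILD
ALTER INDEX [IX_Orders_OrderDate] ON [dbo].[Orders] REORGANIZE PARTITION = 3
```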

Tuesday, November 10, 2009

VS 2010 Training Kit

MSFT released the training kit (October preview) for the VS 2010 framework. It can be downloaded from the Microsoft website. The Beta 2 release of the Training Kit contains 15 presentations, 19 hands-on labs, and 13 demos. Many technologies are covered in this release, including: C# 4, VB 10, F#, Parallel Extensions, Windows Communication Foundation, Windows Workflow, Windows Presentation Foundation, ASP.NET 4, Entity Framework, ADO.NET Data Services, Managed Extensibility Framework, and Visual Studio Ultimate.

Click here to download from Microsoft