Sunday, December 20, 2009

Restoring backup of Partitioned Database in sql server 2008

If we need to restore a database that has some 'n' number of partitions, it is very tedious to specify the restore location of every partitioned filegroup by hand. The example below demonstrates how to automate the task. Run the script in the master database.

IF Exists (SELECT '1' from sysobjects where name = 'RESTOREBKP_2008' and type ='p')
BEGIN
DROP PROCEDURE dbo.RESTOREBKP_2008
END
GO
CREATE PROCEDURE RESTOREBKP_2008
(
@DBName varchar(max), -- DATABASENAME
@BackupPath varchar(max), -- BACKUPFILEPATH
@Path varchar(max) -- PHYSICALPATH
)
AS
BEGIN

DECLARE
--Variables declaration
@ResFilesOnly Nvarchar(max),
@ResQuery Nvarchar(max),
@STR AS nVARCHAR(MAX),
@Logicalname as nvarchar(max),
@physicalname as nvarchar(max)

--- LOOP Variable
DECLARE @I AS INT, @J AS INT


set @ResFilesOnly = 'RESTORE FILELISTONLY
FROM DISK = ''' + @BackupPath + ''''

set @I = 1

DECLARE @RESTORE AS TABLE(
LogicalName nvarchar(128), PhysicalName nvarchar(500), Type char(1), FileGroupName nvarchar(128),
Size numeric(20,0), MaxSize numeric(38,0), FileID bigint, CreateLSN numeric(38,0),
DropLSN numeric(25,0) NULL, UniqueID uniqueidentifier, ReadOnlyLSN numeric(25,0) NULL,
ReadWriteLSN numeric(25,0) NULL, BackupSizeInBytes bigint, SourceBlockSize int,
FileGroupID int, LogGroupGUID uniqueidentifier NULL, DifferentialBaseLSN numeric(38,0) NULL,
DifferentialBaseGUID uniqueidentifier, IsReadOnly bit, IsPresent bit,TDEThumbprint varchar(500)
)
INSERT INTO @RESTORE
EXEC sp_executesql @ResFilesOnly

SELECT @J = MAX(FILEID) FROM @RESTORE
SET @STR = ''
WHILE @I <= @J
BEGIN
SELECT
@Logicalname = LTRIM(RTRIM(Logicalname)), @physicalname = LTRIM(RTRIM(REVERSE(LEFT(REVERSE(physicalname),CHARINDEX('\',REVERSE(physicalname))-1))))
FROM
@RESTORE
WHERE
FILEID = @I
SET @STR = @STR + 'MOVE N''' + @Logicalname + ''' TO N''' + @Path + @physicalname + ''', '

SET @I = @I + 1
END

--Restore Backup Query
set @ResQuery = 'RESTORE DATABASE ' + @dbname + ' FROM DISK = '''+ @BackupPath + ''' WITH '+ @STR +
N' NOUNLOAD, STATS = 10'

EXECUTE SP_EXECUTESQL @ResQuery
END


Execute this stored procedure in your master database as given below:

EXEC RESTOREBKP_2008 '', '', ''
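For example, assuming a backup file at D:\Backups\SalesDB.bak and a target data folder D:\SQLData\ (both hypothetical paths), the call would look like this. Note the trailing backslash on the physical path, since the procedure simply concatenates @Path with each file name:

```sql
EXEC RESTOREBKP_2008 'SalesDB', 'D:\Backups\SalesDB.bak', 'D:\SQLData\'
```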

Note: In case you need any clarification, please mail me at mail@renjucool.com

See my article on MSDN: http://code.msdn.microsoft.com/Sql2008PartionedDb

Thursday, December 17, 2009

Sql Server Reporting Service –Silverlight enabled

For some time I have been thinking about enabling reports (SSRS/CR) with Silverlight technologies. After some googling I found an amazing tool from PerpetuumSoft that displays SSRS reports in an RIA-enabled UI and can even do a deep zoom! Seems rocking!

Debugging for Silverlight

Check out the RIA platform Silverlight debugging tool from MSFT/WHDC. Also see this blog post about a memory leak problem in Silverlight:

 http://blogs.msdn.com/delay/archive/2009/03/11/where-s-your-leak-at-using-windbg-sos-and-gcroot-to-diagnose-a-net-memory-leak.aspx

Friday, December 4, 2009

Table Comparison in Sql Server

SQL Server ships with a command-line tool (TableDiff) to compare the data in two tables. It can perform the following tasks:

  1. A row by row comparison between a source table in an instance of Microsoft SQL Server acting as a replication Publisher and the destination table at one or more instances of SQL Server acting as replication Subscribers.
  2. Perform a fast comparison by only comparing row counts and schema.
  3. Perform column-level comparisons.
  4. Generate a Transact-SQL script to fix discrepancies at the destination server to bring the source and destination tables into convergence.
  5. Log results to an output file or into a table in the destination database.
eg:
 
 "C:\Program Files\Microsoft SQL Server\90\COM\TableDiff.exe" -sourceserver "SERVERNAME" -sourcedatabase "DBNAME" -sourceschema "dbo" -sourcetable "SOURCETABLE1" -sourceuser "sa" -sourcepassword "PASSWORD" -destinationserver "SERVERNAME" -destinationdatabase "DBNAME" -destinationschema "dbo" -destinationtable "SOURCETABLE2" -destinationuser "sa" -destinationpassword "PASSWORD" -dt -o "C:\Documents and Settings\renjuraj\My Documents\diff.xls"
 
Run the above command at a command prompt. Adjust the path to match your SQL Server installation (C:\Program Files\Microsoft SQL Server\90\COM\TableDiff.exe).

Thursday, December 3, 2009

Windows 7 Startup Animation Design

Rolf Ebeling, a senior user experience lead for the User Experience Design and Research Team for Windows, Windows Live, and Internet Explorer, is the man who designed the 105-frame Windows 7 boot animation that millions see or will see every day for years to come. Along with developing the famous boot animation, he also helped with the appearance and functionality of the calculator. He's already confirmed that he'll be working on the next version of Windows. Ebeling was only with the company for four months before he was asked to start designing what would become four swirling balls of light that come together to form a pulsing Windows 7 flag. Although the sketches of the early boot animation concept pictured above don't show it, Ebeling said he looked everywhere for inspiration, including street lights in the rain, light reflecting off water, and fireflies. A self-taught designer with a degree in English literature, Ebeling was a creative director for Newsweek.com in New York before joining Microsoft in April 2008, his first software job.

Source :Microsoft

Monday, November 23, 2009

Longest Running Procedures in Sql Server

It's useful to determine which stored procedures/code have the greatest impact on the server. Sometimes that is determined by examining the I/O cost, sometimes by the execution duration. In this example, total impact is determined by examining the length and the frequency of execution.

SELECT TOP 10
temp.text as ProcedureName,
s.execution_count as ExecutionCount,
isnull( s.total_elapsed_time / s.execution_count, 0 ) as AvgExecutionTime,
s.total_worker_time / s.execution_count as AvgWorkerTime,
s.total_worker_time as TotalWorkerTime,
s.max_logical_reads as MaxLogicalReads,
s.max_logical_writes as MaxLogicalWrites,
s.creation_time as CreationDateTime,
s.total_physical_reads as PhysicalReads,
isnull( s.execution_count / datediff( second, s.creation_time,getdate()), 0 )as CallsPerSecond
FROM sys.dm_exec_query_stats s
CROSS APPLY sys.dm_exec_sql_text( s.sql_handle ) temp
ORDER BY
-- s.total_elapsed_time DESC
s.execution_count desc

Friday, November 20, 2009

Transaction Log growing in Sql Server

The transaction log is one of the most important parts of a SQL Server database, as well as one of the most common generators of problems I see online. The following are the causes of transaction log growth:
 
1. Uncommitted transactions. We can find whether there are any open transactions using DBCC OPENTRAN.
2. Running DBCC DBREINDEX or CREATE INDEX operations with the database in the Full recovery model.
3. Running extremely large transactions such as BULK INSERT or SELECT INTO commands.
http://support.microsoft.com/kb/317375/
 
The following are proactive measures to minimize unexpected log file growth:
 
1.If you do not want point in time recovery of your databases then you can change the recovery model to Simple.
2.Set the size of the transaction log files to a large value to avoid the automatic expansion of the transaction log files.
3.Configure the automatic expansion of transaction log in terms of MB instead of %.
4.Backup the transaction log regularly to delete the inactive transactions in your transaction log if you are using full or bulk logged recovery model.
5.You can switch the recovery model to Bulk logged from full recovery model if you perform some bulk inserts, select into, bcp, alter index, create index commands because these operations will be minimally logged in bulk logged recovery model and after those operations are completed you can switch over to full recovery model.
 

Solutions
  1. Take full backup of your database
  2. Take t-log backup of your database frequently say every 30 or 15 minutes so that log file will not grow drastically
  3. Shrink if you do not have any free space. You can perform this operation manually if required.
  4. Generally avoid shrinking the database and keep it as the last option.
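Steps 1 and 2 above can be sketched as plain BACKUP commands (the database name and backup folder are hypothetical; schedule the log backup via a SQL Agent job):

```sql
-- Step 1: full backup of the database
BACKUP DATABASE MyDB TO DISK = 'D:\Backups\MyDB_full.bak'

-- Step 2: transaction log backup, to be scheduled every 15-30 minutes
BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_log.trn'
```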
If you are in the Full recovery model, then no, the log will not be truncated automatically. Run the following command to check:
 

SELECT DATABASEPROPERTYEX('Lorenzo347', 'RECOVERY')

If it returns FULL or BULK_LOGGED, then you will have to back up the log, either to disk or specifying the truncate-only option to truncate the space off. If you do that, then you might as well change your recovery model to SIMPLE and be done with this problem. If it says you are already in SIMPLE, then it should shrink without problem. (Note that WITH TRUNCATE_ONLY was removed in SQL Server 2008; there you must take a normal log backup or switch to SIMPLE.)
 
backup log <your database name> with truncate_only
 
Shrink Log files
 
DECLARE @LogFileName varchar(100)
SELECT @LogFileName = rtrim(name)
FROM dbo.sysfiles
WHERE Name like '%_log%'
dbcc SHRINKFILE(@LogFileName, 2)

This shrinks the log file to 2 MB.


http://msdn.microsoft.com/en-us/library/aa258824(SQL.80).aspx
  
Note:
Detaching the database and deleting the log is definitely not advisable. At best it forces you to take your database offline; worst case, you invalidate your entire database and have to restore from backup.
 
 
 

Monday, November 16, 2009

Index to be recreated in Sql Server

When your database grows, index fragmentation can become too high, which scales down SQL Server performance. To overcome that we need to rebuild the indexes, which should be done during server downtime. Below is a query that lists all the indexes that should be reorganized or rebuilt. It uses the following views: sys.indexes, sys.tables, sys.schemas, sys.dm_db_index_physical_stats, and sys.partitions. For a more detailed check-up, change the last NULL in the dm_db_index_physical_stats call to 'SAMPLED' or even 'DETAILED'.


SELECT 'ALTER INDEX [' + ix.name + '] ON [' + s.name + '].[' + t.name +'] ' +
CASE WHEN ps.avg_fragmentation_in_percent > 40 THEN 'REBUILD'
ELSE 'REORGANIZE' END +
CASE WHEN pc.partition_count > 1 THEN ' PARTITION = ' +
cast(ps.partition_number as nvarchar(max)) ELSE '' END
FROM sys.indexes AS ix INNER JOIN sys.tables t
ON t.object_id = ix.object_id
INNER JOIN sys.schemas s
ON t.schema_id = s.schema_id
INNER JOIN (SELECT object_id, index_id,avg_fragmentation_in_percent, partition_number
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL,NULL, NULL, NULL)) ps
ON t.object_id = ps.object_id AND ix.index_id = ps.index_id
INNER JOIN (SELECT object_id, index_id, COUNT(DISTINCT
partition_number) AS partition_count
FROM sys.partitions
GROUP BY object_id, index_id) pc
ON t.object_id = pc.object_id AND ix.index_id = pc.index_id
WHERE ps.avg_fragmentation_in_percent > 10 AND
ix.name IS NOT NULL

Tuesday, November 10, 2009

VS 2010 Training Kit

MSFT released the training kit (October preview) of the VS 2010 framework; you can download it from the Microsoft website. The Beta 2 release of the Training Kit contains 15 presentations, 19 hands-on labs, and 13 demos. Many technologies are covered in this release, including: C# 4, VB 10, F#, Parallel Extensions, Windows Communication Foundation, Windows Workflow, Windows Presentation Foundation, ASP.NET 4, Entity Framework, ADO.NET Data Services, Managed Extensibility Framework, and Visual Studio Ultimate.

Click here to download from Microsoft

Wednesday, October 28, 2009

Getting Country List from CultureInfo

We can use CultureInfo to get the list of countries/languages. Today I found a method to list all the countries; it has some duplicate values, which are eliminated by a distinct LINQ query.

public void CountryList()
{
    ArrayList cList = new ArrayList();
    foreach (CultureInfo ci in CultureInfo.GetCultures(CultureTypes.AllCultures & ~CultureTypes.NeutralCultures))
    {
        RegionInfo ri = new RegionInfo(ci.LCID);
        cList.Add(ri.EnglishName);
    }
    var countries = cList.ToArray();
    var i = (from temp in countries select temp).Distinct().OrderBy(s => s);
    foreach (var item in i)
    {
        listBox1.Items.Add(item.ToString());
    }
}

Also, an interesting thing with TextInfo: making the first letter of each word in a string/sentence capital. (A little bit tricky :-))

string temp = "MULTIDIMENSIONAL NEWTON RAPHSON";
string firstSmall = CultureInfo.CurrentCulture.TextInfo.ToTitleCase(temp.ToLower()); // Trick: ToTitleCase leaves all-uppercase words unchanged, so lower-case the string first
MessageBox.Show(firstSmall.ToString());

Output :Multidimensional Newton Raphson

Tuesday, October 27, 2009

Workstation & Server Garbage Collector

There are two types of garbage collector: the workstation GC and the server GC. Workstation is the default on client versions of Windows, but server is much faster on multicore machines. The server GC utilizes more memory.
  

<configuration> <runtime>
<gcServer enabled="true"/>
</runtime>
</configuration>



Note: All sample code is provided for illustrative purposes only. These examples have not been thoroughly tested under all conditions. I therefore cannot guarantee or imply the reliability, serviceability, or function of these programs.

Sunday, October 25, 2009

Bugs in Sql Server

Today I noticed a bug in SQL Server 2005 (also in SSMS 2008) while executing a simple select statement:

select 3 --(*)

Msg 102, Level 15, State 1, Line 1

Incorrect syntax near '--(*'.

If we put a space or any other character before the "(" there is no error. Also,
select 3 --)* does not return an error. Anyway, this is a bug in SQL Server, as it does not parse the comment correctly.



I also got an error message while parsing an SQL script:

"Net SqlClient Data Provider: Msg 0, Level 11, State 0, Line 0
A severe error occurred on the current command. The results, if any, should be discarded."


But the script executes successfully and gives accurate results!

Wednesday, October 14, 2009

Last modified Db User Objects & Tables without Trigger in Sql server

If you need to check the details of all DB user objects by last modified date, try the following query:


select name, modify_date,
case when type_desc = 'USER_TABLE' then 'Table'
     when type_desc = 'SQL_STORED_PROCEDURE' then 'Stored Procedure'
     when type_desc in ('SQL_INLINE_TABLE_VALUED_FUNCTION', 'SQL_SCALAR_FUNCTION', 'SQL_TABLE_VALUED_FUNCTION') then 'Function'
end as type_desc
from sys.objects
where type in ('U', 'P', 'FN', 'IF', 'TF') and is_ms_shipped = 0
order by 2 desc
The query below lists all tables without triggers. This is helpful when someone modifies a table: make sure there aren't any tables that have triggers on them, so that things don't start to break after you make changes.
 
SELECT ob1.name
FROM sysobjects ob1
LEFT JOIN sysobjects ob2 ON ob1.id = ob2.parent_obj AND ob2.xtype = 'TR'
WHERE ob2.name IS NULL AND ob1.xtype = 'U'
ORDER BY ob1.name

ELMAH (Error Logging Modules and Handlers)

HTTP modules and handlers can be used in ASP.NET to provide a high degree of componentization for code that is orthogonal to a web application, enabling entire sets of functionality to be developed, packaged, and deployed as a single unit, independent of an application. ELMAH (Error Logging Modules and Handlers) illustrates this approach by demonstrating application-wide error logging that is completely pluggable. It can be dynamically added to a running ASP.NET web application, or even to all ASP.NET web applications on a machine, without any need for re-compilation or re-deployment. The end user (if allowed) can view the exceptions/inner exceptions logged on the server, their descriptions, and even a screenshot of the error page.


It can be downloaded from
http://code.google.com/p/elmah/
http://msdn2.microsoft.com/en-us/library/aa479332.aspx
http://download.microsoft.com/download/E/9/7/E97DB6A9-3418-46DF-99CA-840357FE4D72/MSDNElmah.msi

Tuesday, October 13, 2009

Using spt_values to generate junk data

master..spt_values can be used if you need to populate a table with, say, 100 numbers starting from 1, or if you need to generate some junk/test data. Refer to the query below, which generates data of various datatypes:

select
abs(checksum(newid()))%10000 as n_id,
abs(checksum(newid()))*rand()/10000 as f_float,
substring(replace(cast(newid() as varchar(36)),'-',''),1,abs(checksum(newid()))%15) as s_string,
dateadd(day,0,abs(checksum(newid()))%100000) as d_date,
abs(checksum(newid()))%2 as b_byte
from master..spt_values
where type='p' and number between 1 and 200

Monday, October 12, 2009

Getting all reference of an Assembly

To get the assembly references of an ASP.NET web application, I found a method that reads them from the .csproj file. Import this namespace as well:

using System.Xml.Linq;
 
public void LoadAssembly(string path)
{
    XNamespace msbuild = "http://schemas.microsoft.com/developer/msbuild/2003";
    XDocument projDefinition = XDocument.Load(path);
    IEnumerable<string> references = projDefinition.Element(msbuild + "Project")
        .Elements(msbuild + "ItemGroup")
        .Elements(msbuild + "Reference")
        .Elements(msbuild + "Name")
        .Select(refElem => refElem.Value);
    foreach (string reference in references)
    {
        richTextBox1.AppendText(reference);
        richTextBox1.AppendText(Environment.NewLine);
    }
}
 
If you need to loop through the IEnumerable iterator to check whether any matching XML nodes were found, you can use the following:
 
int result = 0;
using (IEnumerator<string> enumerator = references.GetEnumerator())
{
    while (enumerator.MoveNext())
        result++;
}
if (result == 0)
{
    // Do your xml operations
}

Trigger to monitor the entire database server


Please find below a trigger script that monitors DDL activity across the entire database server and logs records into an audit table.
It works on SQL Server 2008.
 
USE [master]
-------Create table -------------------------
-- DROP TABLE AuditTrigger
CREATE TABLE [dbo].[AuditTrigger](
[OID] [bigint] IDENTITY(1,1) NOT NULL,
[Details] [varchar](max) NULL,
[Host] [varchar](20) NULL
)

GO
-----Create proc to insert into AuditTrigger table
IF EXISTS(SELECT 1 FROM sysobjects where name = 'PInsertAuditTrigger')
BEGIN
DROP PROCEDURE PInsertAuditTrigger
END
GO
CREATE PROC PInsertAuditTrigger(@Details varchar(max), @hostname VARCHAR(MAX))
AS
BEGIN
INSERT INTO master.dbo.AuditTrigger(DETAILS,HOST) SELECT @DETAILS,@HOSTNAME
END
-------- Trigger to monitor tables
GO
CREATE TRIGGER [Table_Trigger]
ON ALL SERVER
FOR create_table, alter_table, drop_table
AS
BEGIN
print 'Create/Alter table is not allowed - '+host_name()
rollback;
DECLARE @host_name varchar(max),@Details varchar(max)
SELECT @host_name = Host_name(),@Details = 'Create/Alter table'
exec dbo.PInsertAuditTrigger @Details,@host_name
END
-------- Trigger to monitor procedures
GO
CREATE TRIGGER [Proc_Trigger]
ON ALL SERVER
FOR create_procedure, alter_procedure, drop_procedure
AS
BEGIN
print 'Create/Alter procedure is not allowed - '+host_name()
rollback;
DECLARE @host_name varchar(max),@Details varchar(max)
SELECT @host_name = Host_name(),@Details = 'Create/Alter procedure'
exec dbo.PInsertAuditTrigger @Details,@host_name
END
-------- Trigger to monitor login,roles and database
GO
CREATE TRIGGER [Misc_Trigger]
ON ALL SERVER
FOR create_role, drop_role, alter_role, create_database, drop_database, alter_database, create_view, alter_view, drop_view, create_login, drop_login
AS
BEGIN
print 'Illegal operation is not allowed - '+host_name()
rollback;
DECLARE @host_name varchar(max),@Details varchar(max)
SELECT @host_name = Host_name(),@Details = 'Illegal operation'
exec dbo.PInsertAuditTrigger @Details,@host_name
END
GO

Friday, October 9, 2009

Removing all active connections from the db

Use Master
Go
Declare @dbname sysname
Set @dbname = 'DBNAME'
Declare @spid int Select @spid = min(spid) from master.dbo.sysprocesses
where dbid = db_id(@dbname)
While @spid Is Not Null Begin
Execute ('Kill ' + @spid)
Select @spid = min(spid) from master.dbo.sysprocesses
where dbid = db_id(@dbname) and spid > @spid
End

Thursday, October 8, 2009

Sql Server 2008 - Spatial Data

Microsoft SQL Server 2008 delivers comprehensive spatial support that enables organizations to seamlessly consume, use, and extend location-based data through spatial-enabled applications, which ultimately helps end users make better decisions. You can store lat/long data in a geography data type and calculate/query using the functions that go along with it. It supports both planar and geodetic data.


•Use the new geography data type to store geodetic spatial data and perform operations on it

•Use the new geometry data type to store planar spatial data and perform operations on it

•Take advantage of new spatial indexes for high performance queries

•Use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio

•Extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications

DECLARE @g GEOMETRY
SET @g = 'LINESTRING ( 69 26, 69 23, 69 21, 67 20, 65 20,
63 18, 58 17, 52 17, 51 17, 49 17, 45 18, 44 20,
44 21, 42 26, 42 29, 42 32, 42 35, 43 35, 47 38,
50 41, 55 42, 58 42, 65 44, 66 44, 67 44, 68 45,
69 47, 70 48, 70 50, 71 51, 70 56, 68 60, 68 63,
66 65, 65 66, 63 68, 60 71, 59 71, 57 71, 55 71,
51 69, 45 65, 44 63, 42 62, 41 59, 41 57, 41 56,
41 54, 42 53 ,67 77 ,44 88)'
insert into location values(2,'renju',null,@g);

declare @p geometry
set @p= 'POLYGON ((30 30, 40 30, 40 40, 30 40, 30 30))'
insert into location values(3,'renju2',null,@p);

http://www.microsoft.com/sqlserver/2008/en/us/spatial-data.aspx

Sql Server Hidden Features - Part 2

A lot of SQL Server developers still don't seem to know about the OUTPUT clause (SQL Server 2005 and newer) on the DELETE, INSERT, and UPDATE statements.
It can be extremely useful to know which rows have been INSERTed, UPDATEd, or DELETEd, and the OUTPUT clause allows you to do this very easily -
it gives access to the "virtual" tables called inserted and deleted (like in triggers):
 
DELETE FROM (table)
OUTPUT deleted.ID, deleted.Description
WHERE (condition)
 
If you're inserting values into a table which has an INT IDENTITY primary key field, with the OUTPUT clause, you can get the inserted new ID right away:

INSERT INTO MyTable(Field1, Field2)
OUTPUT inserted.ID
VALUES (Value1, Value2)
 
And if you're updating, it can be extremely useful to know what changed - in this case, inserted represents the new values (after the UPDATE), while deleted refers to the old values before the UPDATE:
 
UPDATE (table)
SET field1 = value1, field2 = value2
OUTPUT inserted.ID, deleted.field1, inserted.field1
WHERE (condition)

If a lot of info will be returned, the output of OUTPUT can also be redirected to a temporary table or a table variable (OUTPUT INTO @myInfoTable).
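A minimal sketch of that redirection, assuming a hypothetical MyTable with ID, field1, and field2 columns:

```sql
DECLARE @myInfoTable TABLE (ID int, OldValue varchar(100), NewValue varchar(100))

UPDATE MyTable
SET field1 = 'new value'
OUTPUT inserted.ID, deleted.field1, inserted.field1 INTO @myInfoTable(ID, OldValue, NewValue)
WHERE field2 = 'x'

SELECT * FROM @myInfoTable
```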
 
More tips
  • Use ctrl-0 to insert a null value in a cell
  • WITH (FORCESEEK) which forces the query optimizer to use only an index seek operation as the access path to the data in the table.
    http://msdn.microsoft.com/en-us/library/bb510478.aspx
  • PIVOT and UNPIVOT
  • Save transactions
  • Select xact_state()  --Returns whether transaction pending/committed
  • xp_fixeddrives       --Returns the free disk space in the server
  • SELECT ASCII('A'),CHAR(65),CHAR(ASCII('A')),UNICODE(N'?'),NCHAR(923),NCHAR(UNICODE(N'?')) --Some Ascii/unicode features
  • select * from tablename tablesample(2 percent) --Returns about 2 percent of the records in the table. Alternatively: select top 10 percent * from table
  • select cast(GETDATE()as varchar) as datetime --Returns the date time as formatted.
  • WITH TIES
    A plain SELECT TOP N query always returns exactly N records, arbitrarily dropping any records that have the same value as the last record in the group; WITH TIES includes those tied rows as well.
  • select top 5 with ties * from scores order by name desc
  • SELECT Column FROM Table ORDER BY CHECKSUM(NEWID()) -- Return rows in a random order
  • master..xp_cmdshell to run commands from a command prompt on the server
    master..xp_cmdshell 'dir d:\'   --List the Directory
    master..xp_cmdshell 'sc query "ccm6.5_ess"' --Query a service
    master..xp_cmdshell 'sc start "ccm6.5_ess"' --Start a service
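As an illustration of the FORCESEEK hint from the list above (the table, column, and value are hypothetical; the hint assumes a suitable index exists on CustomerID):

```sql
SELECT OrderID, OrderDate
FROM dbo.Orders WITH (FORCESEEK)
WHERE CustomerID = 42
```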

 
 
 
 

Saturday, October 3, 2009

Two free security tools from Microsoft SDL team

The SDL team here at Microsoft released a couple of new tools recently to help development teams verify the security of their software before they ship. BinScope Binary Analyzer and MiniFuzz File Fuzzer are both being offered as free downloads. The team took the time to make sure that both tools work as stand-alone tools as well as integrated into Visual Studio and Team System.

BinScope is a verification tool that has been used inside Microsoft for several years to help developers and testers confirm they are building their code to use compiler/linker protections required by the SDL. BinScope allows you to scan your code to verify you are setting important security protections such as /GS, /SafeSEH, /NXCOMPAT, and /DYNAMICBASE. In addition it checks to see that you are using .NET strong-named assemblies, good ATL headers, an up-to-date compiler, and not using dangerous constructs such as global function pointers.



Both of these tools are equipped to easily integrate with Visual Studio 2008 Pro as well as Team Foundation Server 2008 and Team System 2008. By installing BinScope as integrated, it can be launched and output results within the Visual Studio IDE. MiniFuzz can be installed as an external tool add-in. Both tools have easy-to-set integration with Team Foundation Server 2008 and complement the SDL Process Template for VSTS.

Writing secure code is becoming very important to most development teams. I am glad to see the SDL team making these types of tools available to the Visual Studio development community and making it easier to ship more secure code.

Sql Server Hidden Features

Stored Procedures

* sp_msforeachtable: Runs a command with '?' replaced with each table name (v6.5 and up)
* sp_msforeachdb: Runs a command with '?' replaced with each database name (v7 and up)
* sp_who2: just like sp_who, but with a lot more info for troubleshooting blocks (v7 and up)
* sp_helptext: If you want the code of a stored procedure
* sp_tables: return a list of all tables
* sp_stored_procedures: return a list of all stored procedures
* xp_sscanf: Reads data from the string into the argument locations specified by each format argument.
* xp_fixeddrives: Find the fixed drive with the largest free space
* sp_help: If you want to know the table structure, indexes and constraints of a table
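As a quick illustration of sp_msforeachtable from the list above, the following prints the row count of every user table in the current database ('?' is replaced by each table name):

```sql
EXEC sp_msforeachtable 'SELECT ''?'' AS TableName, COUNT(*) AS RowCnt FROM ?'
```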
Functions

* HashBytes()
* EncryptByKey
* PIVOT command

TableDiff.exe

* Table Difference tool allows you to discover and reconcile differences between a source and destination table or a view. Tablediff Utility can report differences on schema and data. The most popular feature of tablediff is the fact that it can generate a script that you can run on the destination that will reconcile differences between the tables.
See More

Return rows in a random order
SELECT
SomeColumn
FROM
SomeTable
ORDER BY
CHECKSUM(NEWID())

In Management Studio, you can put a number after a GO end-of-batch marker to cause the batch to be repeated that number of times:

PRINT 'X'
GO 10

Will print 'X' 10 times. This can save you from tedious copy/pasting when doing repetitive stuff

Identity Column Insert

Last week I had a scenario where I needed to insert values into a table having only an identity column.

eg: create table test(id int identity)

Is it possible to insert values into the table without setting IDENTITY_INSERT on/off? I found the following method:

insert into test default values
go 20

This will insert 20 records to the table haaa...
Will update later..

Google Wave: the sky is falling!!!

Google Wave is a product that helps users communicate and collaborate on the web. A "wave" is equal parts conversation and document, where users can almost instantly communicate and work together with richly formatted text, photos, videos, maps, and more. Google Wave is also a platform with a rich set of open APIs that allow developers to embed waves in other web services and to build extensions that work inside waves.

A wave is equal parts conversation and document. People can communicate and work together with richly formatted text, photos, videos, maps, and more.

A wave is shared. Any participant can reply anywhere in the message, edit the content and add participants at any point in the process. Then playback lets anyone rewind the wave to see who said what and when.

A wave is live. With live transmission as you type, participants on a wave can have faster conversations, see edits and interact with extensions in real-time.

So is Wave going to threaten RIA platforms? I don't know. Is it even an RIA platform? I just think that all the messages about how Wave is pushing out things like Flash, Silverlight, or JavaFX are unfounded at this point. They all serve purposes. I have requested an invitation to join Wave but have not yet received one from Google. Anyone who got an invite can invite 8 more people, so if you have one, please invite me at mail@renjucool.com
http://wave.google.com

Thursday, August 20, 2009

Sony announces PS3 Slim for $299

Sony finally announced the much rumored PlayStation 3 Slim gaming console at the on-going GamesCom 2009 Expo in Cologne, Germany. Back in May, images of Sony PS3 Slim chasis and retail package were leaked. Sony Computer Entertainment CEO Kaz Hirai unveiled new PlayStation 3 Slim model at the GamesCon 2009 Expo.
Sony will offer PS3 Slim in North America and Europe based stores from Sept. 1 for $299 (Rs. 14,400 approx.). The Sony PlayStation 3 Slim CECH-2000A features 120GB HDD and will come with PS3 firmware 3.0 version. Sony said it currently has no plans to bring PS3 Slim in India. Interestingly, PS3News folks dug up some information about another PS3 Slim CECH-2000B model with 250GB HDD from FCC site.


Fat and huge looking Sony PS3 will shed weight and size in new PlayStation 3 Slim version that will pack 120GB storage. Sony has redesigned the internal design architecture of PS3 Slim - from main semiconductors to power supply unit. Hirai, at the launch, said that PS3 Slim is 33 percent smaller and 36 percent lighter than the previous PS3 models.

PS3 Slim has new 45nm process manufactured Cell processor that runs at the same clock speed as the old 60nm processor. Obviously this smaller processor would be more power efficient and power consumption can be brought down by two-thirds which will reduce fan noise. Sony has designed PS3 Slim to be used horizontally (as PS2 is used) unlike the earlier PS3 models. However, Sony offers a $24 (Rs. 1,200 approx.) stand to hold the PS3 Slim vertically.

Along with that, Sony has also slashed prices of the existing PS3 models by $100 (Rs. 4,800 approx.). So now, the 80GB HDD bearing PS3 will cost $299 (Rs. 14,400 approx.) and the 160GB HDD bearing PS3 will cost $399 (Rs. 19,200 approx.).

The PlayStation 3 80 GB prices for India have been slashed by 20 percent and now the console costs Rs. 19,990. Along with that, Sony India will also offer two Free Games: Uncharted: Drake's Fortune and GT5 Prologue.

Saturday, August 15, 2009

Windows Azure Services Platform

Windows® Azure is a cloud services operating system that serves as the development, service hosting and service management environment for the Windows Azure Platform. Windows Azure provides developers with on-demand compute and storage to host, scale, and manage Web applications on the Internet through Microsoft® data centers.

Windows Azure is currently in Community Technology Preview. Commercial availability for Windows Azure will likely be at the end of calendar year 2009.

Windows Azure is an open platform that will support both Microsoft and non-Microsoft languages and environments. To build applications and services on Windows Azure, developers can use their existing Microsoft® Visual Studio® 2008 expertise. In addition, Windows Azure supports popular standards and protocols including SOAP, REST, XML, and PHP.

Windows Azure services can be used to:
* Add Web service capabilities to existing packaged applications
* Build, modify, and distribute applications to the Web with minimal on-premises resources
* Perform services (large-volume storage, batch processing, intense or large- volume computations, etc.) off premises
* Create, test, debug, and distribute Web services quickly and inexpensively
* Reduce costs of building and extending on-premises resources
* Reduce the effort and costs of IT management

On August 11, 2009, Microsoft released a new set of features for Windows Azure Blob. Windows Azure Blob enables applications to store and manipulate large objects and files in the cloud. Blobs (files) can be up to 50GB in size for the CTP.

All changes for this release are versioned changes, using “x-ms-version: 2009-07-17”. All prior versions of commands executed against the storage system will continue to work, as we extend the capabilities of the existing commands and introduce new commands.
http://www.microsoft.com/azure/netservices.mspx
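As a small, hedged sketch of the versioning mechanism described above: every request against the storage system carries an x-ms-version header, and the header value below is the one quoted in the announcement. The date string and helper function are invented for illustration only.

```python
def blob_request_headers(date_str, version="2009-07-17"):
    """Build the minimal header set a versioned Blob API request would carry."""
    return {
        "x-ms-version": version,  # pins the request to a specific API version
        "x-ms-date": date_str,    # request timestamp the service expects
    }

headers = blob_request_headers("Tue, 11 Aug 2009 00:00:00 GMT")
print(headers["x-ms-version"])  # → 2009-07-17
```

Because the version travels in a header rather than the URL, older clients keep working unchanged while new commands are introduced.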

Silverlight and Deepzoom

Silverlight Deep Zoom provides the ability to view high resolution images rapidly without affecting the performance of your applications.
This page from silverlight.net has some quick start tutorials for Silverlight Deep Zoom.
http://memorabilia.hardrock.com/ has the best example for Silverlight Deep Zoom implementation.

The above page has hundreds of high-resolution images of rock bands. Below are screenshots of Silverlight Deep Zoom in action.



I don't need to tell you who the guys are ;-)

[Screenshots: successive Deep Zoom zoom-out levels, from the band photo all the way out to the full memorabilia wall. Can you still find the guys?]


This is the beauty of Silverlight Deep Zoom. So next time, instead of sending the large collection of your last picnic photos as bulky attachments, create a simple Silverlight site with Deep Zoom and send it with style.

Sunday, July 19, 2009

Data Recovery/Live Software-ERD Commander

ERD Commander is a very useful tool that Microsoft acquired with its purchase of Winternals. It's especially useful for computers that aren't able to boot into Windows, or even safe mode. ERD (Emergency Repair Disk) Commander allows access to Windows restore points, file recovery, crash analysis, hotfix uninstall, and other low-level operating system tasks, all in a very familiar Windows interface. It also provides network and internet access, as well as a web browser.


The full version is intended for IT professionals and is available only as part of the Microsoft Desktop Optimization Pack for Software Assurance customers. However, you can download a free trial as part of the Microsoft Diagnostics and Recovery Toolset. After downloading this toolset, install it to find an ISO file containing ERD Commander:

C:\Program Files\Microsoft Diagnostics and Recovery Toolset\erd50.iso

An ISO is a special file that, when burned to a CD, creates a bootable CD. Windows doesn't support ISO burning without third-party software; ISO Recorder is a very simple, free program for creating CDs from ISOs. After you've burned the CD from the ISO, simply boot the system from it to start ERD Commander. If your system doesn't boot from the CD, you may have to change the BIOS boot settings (boot order).

A couple of caveats: the trial version is ERD Commander 2005, while the latest version (currently ERD 6.0) ships only with the Desktop Optimization Pack. Version 6.0 is Vista compatible; 2005 is not (it supports Windows NT, 2000, XP, and Server 2003). The trial period is 30 days. Next time you're faced with an unbootable Windows XP system, give ERD Commander a try.

Saturday, May 2, 2009

Converting Text to RSS Feeds


Today I put together a simple method to convert a raw text document into an RSS feed using a Generic Handler. Please refer to the code below.
<%@ WebHandler Language="C#" Class="Cricket" %>

using System;
using System.Web;
using System.Net;
using System.Text;
using System.Linq;

public class Cricket : IHttpHandler {

    public void ProcessRequest(HttpContext context)
    {
        StringBuilder sb = new StringBuilder();
        context.Response.ContentType = "text/xml";

        // RSS header and channel metadata
        sb.Append(@"<?xml version=""1.0"" encoding=""utf-8""?>
<rss version=""2.0"">
<channel>
<title>Score</title>
<link>http://renjucool.co.nr</link>
<description>Get latest score</description>
<copyright>(c) 2008 renjucool.co.nr</copyright>");

        // One item carrying the latest score
        sb.Append("<item>");
        sb.Append("<title>Score of " + DateTime.Now.ToShortDateString() + "</title>");
        sb.Append("<description>Score of " + GetScore() + "</description>");
        sb.Append("<pubDate>" + DateTime.Now.ToString("ddd, dd MMM yyyy hh:mm:ss tt") + "</pubDate>");
        sb.Append("</item>");

        sb.Append(@"</channel>
</rss>");
        context.Response.Write(sb.ToString());
    }

    public string GetScore()
    {
        string ret = "";
        WebClient myClient = new WebClient();
        String data = Encoding.ASCII.GetString(myClient.DownloadData
            ("http://renju.sparkonnet.com/files/raw.tx"));

        // Split the raw file on line feeds and keep the value part of each line
        foreach (string s in (from s in data.Split((char)10) select GetValue(s)))
        {
            ret += " " + s;
        }
        return ret;
    }

    public string GetValue(string inp)
    {
        if (inp.IndexOf("=") == -1) return inp;
        return inp.Substring(inp.IndexOf("=") + 1);
    }

    public bool IsReusable {
        get {
            return false;
        }
    }
}
Please refer to the screenshot for the exact markup, since the code contains HTML/XML tags that the blog engine tends to strip.
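For comparison, the same raw-text-to-feed idea can be sketched in Python. This is only an illustration of the technique, not the handler itself: the sample input text, the fixed date, and the helper names are made up, while the channel title and link mirror the listing above.

```python
from datetime import datetime
from xml.sax.saxutils import escape

def get_value(line):
    # Mirror of GetValue: keep the text after '=', or the whole line if none.
    return line.split("=", 1)[1] if "=" in line else line

def text_to_rss(raw_text, title="Score", link="http://renjucool.co.nr"):
    # Join the value part of every line into one score string.
    score = " ".join(get_value(s) for s in raw_text.split("\n"))
    now = datetime(2009, 5, 2, 10, 0, 0)  # fixed date for a reproducible example
    item = ("<item>"
            "<title>Score of " + now.strftime("%d/%m/%Y") + "</title>"
            "<description>" + escape(score) + "</description>"
            "<pubDate>" + now.strftime("%a, %d %b %Y %I:%M:%S %p") + "</pubDate>"
            "</item>")
    return ('<?xml version="1.0" encoding="utf-8"?>'
            '<rss version="2.0"><channel>'
            "<title>" + title + "</title><link>" + link + "</link>"
            + item + "</channel></rss>")

feed = text_to_rss("team=India\nscore=245/4")
print(feed)
```

The key=value parsing is identical to the handler's GetValue; everything after the first '=' on each line becomes feed content.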

Thursday, April 23, 2009

v$session for oracle and sql server

select * from sys.sysprocesses

can be used in SQL Server in place of Oracle's v$session.

If you work with SQL Server or Oracle, at some point you will probably need to determine your unique connection identifier. In SQL Server, the connection identifier is called a Server Process ID, or SPID. It can be obtained from the global variable @@SPID:
SELECT @@SPID
Once the SPID is determined, it can be used to query various system tables (e.g., sysprocesses, sysobjects) to obtain process information.
In Oracle, the connection identifier is called a Session ID, or SID. To get the SID, query the V$SESSION view:

SELECT sid from v$session where audsid = userenv('sessionid');
Once the SID is determined, it can be used to look up session information in other Oracle system views such as V$SQL.

Disk I/O: The Performance Bottleneck

Many people think of "performance tuning" as optimizing loops, algorithms, and memory use. In truth, however, the huge performance gains come not from optimizing CPU and memory use (worthwhile as that is), but from eliminating I/O calls.

Disk I/O is responsible for almost all slow websites and desktop applications. It's true. Watch your CPU use next time you open a program, or when your server is under load. CPUs aren't the bottleneck anymore; your hard drive is. At the hardware level, the hard drive is the slowest component by an incredibly large factor. Today's memory ranges between 3200 and 10400 MB/s, while today's desktop hard drives average about 50 MB/s (Seagate 500GB), with high-end drives reaching 85 MB/s (WD 640, Seagate 1TB). Looking at bandwidth alone, hard drives are some 60 to 200 times slower. Bandwidth, though, isn't the killer; it's the latency. Few modern hard drives have latencies under 13 milliseconds, while memory latency is usually about 5 nanoseconds, roughly 2.6 million times faster.

You're probably looking at these numbers and thinking, "13ms is quite fast enough for me, and my app is only dealing with small files." However, I have a question: what other applications are using that drive? If you're on a shared server, the odds are high that between 25 and 2,500 ASP.NET apps are running off the same drive.

CPU, bandwidth, and memory throttling is becoming more and more common on shared servers and virtualization systems, but practical disk throttling isn't even on the horizon from what I can tell. Improper I/O usage from any app affects everybody.

Since hard drives are slow, pending operations go into a queue. So even if your app only needs a single byte of data from the hard drive, it still has to wait its turn. It's quite common for disk operations to take several seconds on a shared server under heavy load. If any application on the server is paging to disk from excessive memory use, an operation can take several minutes, causing a timeout.

Realistic I/O performance is really hard to simulate in a development environment. On a dedicated development machine, disk queues are short and response times are usually near the optimal 13ms, which tends to give software developers gravely incorrect ideas about the performance characteristics of their applications.
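The bandwidth and latency gaps quoted above are easy to sanity-check with a little arithmetic; all figures below come from the paragraphs above:

```python
# Figures quoted above: memory ~3200-10400 MB/s vs desktop disks ~50 MB/s,
# and ~13 ms disk latency vs ~5 ns memory latency.
mem_bw_low_mbs, mem_bw_high_mbs = 3200, 10400
disk_bw_mbs = 50

disk_latency_ns = 13 * 1_000_000   # 13 ms expressed in nanoseconds
mem_latency_ns = 5

bw_gap_low = mem_bw_low_mbs // disk_bw_mbs     # → 64
bw_gap_high = mem_bw_high_mbs // disk_bw_mbs   # → 208
latency_gap = disk_latency_ns // mem_latency_ns

print(bw_gap_low, bw_gap_high)  # → 64 208
print(latency_gap)              # → 2600000
```

The takeaway: the bandwidth gap is two orders of magnitude, but the latency gap is six, which is why queued seeks, not raw transfer rates, dominate under load.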

This is one of my favourite articles that I have read. Click to read more

Monday, April 6, 2009

String Searching in a database

This query searches for a string in the entire database.
Provide two inputs: the database name and the search string.
The query takes some time, but it will surely get you the results.
You can customize the search criteria…


----------------------------------------------------------
Use [Database_Name]
go

declare @SearchChar varchar(8000)
Set @SearchChar = 'Search_Text' -- e.g. 'renju%', '11/11/2006'

declare @CMDMain varchar(8000), @CMDMainCount varchar(8000), @CMDJoin varchar(8000)
declare @ColumnName varchar(100), @TableName varchar(100)

-- Outer cursor: every user table in the database
declare dbTable cursor for
    SELECT Distinct b.Name as TableName
    FROM sysobjects b
    WHERE b.type = 'u' and b.Name <> 'dtproperties'
    order by b.Name

open dbTable
fetch next from dbTable into @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Inner cursor: every column of the current table
    declare db cursor for
        SELECT c.Name as ColumnName
        FROM sysobjects b, syscolumns c
        WHERE c.id = b.id and b.type = 'u' and b.Name = @TableName
        order by b.Name

    open db
    fetch next from db into @ColumnName

    set @CMDMain = 'SELECT ' + char(39) + @TableName + char(39) + ' as TableName,' +
                   ' [' + @TableName + '].* FROM [' + @TableName + ']' +
                   ' WHERE '
    set @CMDMainCount = 'SELECT Count(*) FROM [' + @TableName + '] Where '
    Set @CMDJoin = ''

    -- Build one LIKE predicate per column, OR-ed together
    WHILE @@FETCH_STATUS = 0
    BEGIN
        set @CMDJoin = @CMDJoin + 'Convert(varchar(5000),[' + @ColumnName + ']) like ' +
                       char(39) + @SearchChar + char(39) + ' OR '
        fetch next from db into @ColumnName
    end
    close db
    deallocate db

    -- Only run the SELECT when the table actually contains a match
    Set @CMDMainCount = 'If (' + @CMDMainCount + Left(@CMDJoin, len(@CMDJoin) - 3) + ') > 0 Begin '
    Set @CMDMain = @CMDMainCount + @CMDMain + Left(@CMDJoin, len(@CMDJoin) - 3)
    Set @CMDMain = @CMDMain + ' End '

    Print @CMDMain
    exec (@CMDMain)

    fetch next from dbTable into @TableName
end
close dbTable
deallocate dbTable
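The dynamic SQL assembled by the two cursors above can also be sketched in Python. The table and column names below are invented for illustration; the function only demonstrates the string-building pattern, not a replacement for the T-SQL:

```python
def build_search_sql(table, columns, search):
    """Mimic the cursor logic: one LIKE predicate per column, OR-ed together."""
    likes = " OR ".join(
        "Convert(varchar(5000),[" + col + "]) like '" + search + "'"
        for col in columns
    )
    # Wrap the SELECT in a count check so empty tables produce no result set.
    count_q = "If (SELECT Count(*) FROM [" + table + "] Where " + likes + ") > 0 Begin "
    main_q = ("SELECT '" + table + "' as TableName, [" + table + "].* "
              "FROM [" + table + "] WHERE " + likes + " End")
    return count_q + main_q

sql = build_search_sql("Customers", ["Name", "City"], "renju%")
print(sql)
```

Seeing the generated string in one place makes it easier to follow what the Print/exec pair at the end of the cursor loop actually runs per table.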


Renju

Saturday, March 7, 2009

Top Ten IIS Performance Optimization

1) Enable HTTP Keep-Alive
This setting is enabled by default. Apart from improving the client connection experience, it must be enabled for integrated authentication (and any connection-based authentication) to work.

2) Adjust Connection Timeouts
Right on the spot, as highlighted by the author. You may want to adjust this according to your needs. To me, 120 seconds feels way too long for a connection timeout; I typically set 30 to 45 seconds. If I, the IIS server, take longer than that to respond to the client or to wait for data, I would just drop the connection. There is no point holding the resource for more than 30 seconds :) Obviously, you need to evaluate your environment to derive the correct timeout value.

3) Enable HTTP Compression
Yes, some prefer third-party tools like httpzip or xcompress; if you are like me, you will stick with the built-in compression feature. Refer to this KB article to add more document types for compression:
Click How to specify additional document types for HTTP compression
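To see why compression pays off, here is a standalone sketch using Python's gzip module on a repetitive HTML-like payload (the sample markup is made up). Static, repetitive text compresses dramatically, which is exactly what HTTP compression exploits:

```python
import gzip

# A repetitive response body, typical of generated HTML.
html = ("<html><body>"
        + "<div class='row'>cricket scores</div>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)
print(len(html), len(compressed))
print(len(compressed) < len(html) // 10)  # compresses to well under a tenth
```

The CPU cost of compressing is tiny next to the bandwidth saved on text content, which is why enabling it for additional document types is usually a win.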

4) Grow a Web Garden
This can be tricky. Although it will boost your application's response time, take note: if you are using any session variables, make sure you have out-of-process session management in place. Since the worker processes inside a web garden do not share memory space, in-process session management will lose session details whenever a request was previously handled by a different worker process. The following KB article explains session management for ASP.NET.
Click
INFO: ASP.NET State Management Overview

5) Adjust the IIS Object Cache TTL
A great suggestion; the default value is too short. Typically I would go for at least 60 seconds, unless your site is purely dynamic and content must not be cached. Related to this object TTL is the UserToken TTL, which defaults to 15 minutes and is suitable for most setups. If your user account management is very dynamic (e.g., temp accounts created on the fly that are only valid for a short period), you might want to shorten the value. In IIS 6, setting it to '0' disables the UserToken cache. For more info, refer to:
Click
Changing the Default Interval for User Tokens in IIS

6) Recycle
This applies specifically to IIS 6. While most users will stick with the defaults, I, on the other hand, disable all recycling events, because I think a well-written and tested application should not behave strangely and cause issues :) Well, that is just my wish; in real life, especially with the IIS servers you manage, something will haunt you from time to time. I remember the days when developers always complained that "something is wrong with your IIS configuration", while you, the system admin, kept insisting you were innocent and the code was the cause. Until you find a way to prove that it has nothing to do with IIS and the problem is with the application, your boss will typically side with the developer :) Anyway, back to the topic: if you are seeing problems with your application, use any of the recycle events to keep the application available until you figure out what's wrong. Also, always check 'shutdown worker process after being idle for' X minutes. Recycling is good: not only does it refresh your application (sometimes I don't agree with this view), it also returns unused resources to the system pool. Oh, and what do you do with IIS 5? You can try the IIS 5.0 Process Recycling tool; refer to:
Click
How to recycle IIS 5.0 with the IIS 5.0 Process Recycling tool

7) Limit Queue Length
I'm happy with the default limit; note there's a typo in the article: the default is 4000, not 1000 :) And of course, at no time should you see a lot of queued requests. If you do, you are either hitting a hardware bottleneck or something is really wrong with your application.

8) Shift Priority to the Working Set
I am lazy :) I never really change this setting, except for SQL Server. If you are running on a low-powered box with little CPU power and memory, you should look into this.

9) Add Memory
Mm... by my standard (I have plenty of budget), my web servers are typically equipped with 2GB of memory, and that is good enough for most web applications out there. Of course, depending on the number of users and the nature of the application, you may need more RAM or even network load balancing. In my experience, to speed up a web server you need more CPU processing power, while a database server needs tons of memory. To make the right call, always do performance monitoring to determine whether memory is a bottleneck.

10) Use Disk Striping
This is like a bonus. In my opinion, since you have no control for the IIS binaries reside on the system partition, you should have maximum read output for your website pages, and maximum write performance for the log files. Hence, you should have a mirrorset for web pages, and disk striping without parity for the log files. Also, it always the best practice to have the above (IIS binaries, Web pages, IIS log files) in three different partitions/drive and secure it properly with NTFS permissions.