Thursday, August 20, 2009

Sony announces PS3 Slim for $299

Sony finally announced the much-rumored PlayStation 3 Slim gaming console at the ongoing GamesCom 2009 expo in Cologne, Germany. Back in May, images of the Sony PS3 Slim chassis and retail package were leaked. Sony Computer Entertainment CEO Kaz Hirai unveiled the new PlayStation 3 Slim model at the show.
Sony will offer the PS3 Slim in North American and European stores from Sept. 1 for $299 (Rs. 14,400 approx.). The Sony PlayStation 3 Slim CECH-2000A features a 120GB HDD and will ship with PS3 firmware version 3.0. Sony said it currently has no plans to bring the PS3 Slim to India. Interestingly, the PS3News folks dug up information on the FCC's site about another PS3 Slim model, the CECH-2000B, with a 250GB HDD.


The fat, huge-looking Sony PS3 sheds weight and size in the new PlayStation 3 Slim version, which packs 120GB of storage. Sony has redesigned the internal architecture of the PS3 Slim, from the main semiconductors to the power supply unit. At the launch, Hirai said that the PS3 Slim is 33 percent smaller and 36 percent lighter than the previous PS3 models.

The PS3 Slim has a new Cell processor manufactured on a 45nm process that runs at the same clock speed as the old 65nm part. This smaller processor is more power efficient: power consumption is cut by about a third, which in turn reduces fan noise. Sony has designed the PS3 Slim to be used horizontally (as the PS2 is), unlike the earlier PS3 models. However, Sony offers a $24 (Rs. 1,200 approx.) stand to hold the PS3 Slim vertically.

Along with that, Sony has also slashed prices of the existing PS3 models by $100 (Rs. 4,800 approx.). So now the 80GB PS3 costs $299 (Rs. 14,400 approx.) and the 160GB PS3 costs $399 (Rs. 19,200 approx.).

PlayStation 3 80GB prices for India have been slashed by 20 percent, and the console now costs Rs. 19,990. Along with that, Sony India will also bundle two free games: Uncharted: Drake's Fortune and GT5 Prologue.

Saturday, August 15, 2009

Windows Azure Services Platform

Windows® Azure is a cloud services operating system that serves as the development, service hosting and service management environment for the Windows Azure Platform. Windows Azure provides developers with on-demand compute and storage to host, scale, and manage Web applications on the Internet through Microsoft® data centers.

Windows Azure is currently in Community Technology Preview. Commercial availability for Windows Azure will likely be at the end of calendar year 2009.

Windows Azure is an open platform that will support both Microsoft and non-Microsoft languages and environments. To build applications and services on Windows Azure, developers can use their existing Microsoft® Visual Studio® 2008 expertise. In addition, Windows Azure supports popular standards and protocols including SOAP, REST, XML, and PHP.

Windows Azure services can be used to:
* Add Web service capabilities to existing packaged applications
* Build, modify, and distribute applications to the Web with minimal on-premises resources
* Perform services (large-volume storage, batch processing, intense or large-volume computations, etc.) off premises
* Create, test, debug, and distribute Web services quickly and inexpensively
* Reduce costs of building and extending on-premises resources
* Reduce the effort and costs of IT management

On August 11, 2009, Microsoft released a new set of features for Windows Azure Blob. Windows Azure Blob enables applications to store and manipulate large objects and files in the cloud. The blobs (files) can be up to 50GB in size in the CTP.

All changes for this release are versioned changes, using “x-ms-version: 2009-07-17”. All prior versions of commands executed against the storage system will continue to work, as we extend the capabilities of the existing commands and introduce new commands.
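As a quick sketch, opting in to a versioned command is just a matter of sending the x-ms-version header with the request. The storage account, container, and blob names below are made up for illustration, and a real request would also need a signed Authorization header (omitted here):

```csharp
using System;
using System.IO;
using System.Net;

class BlobVersionedGet
{
    static void Main()
    {
        // Hypothetical storage account/container/blob names
        var request = (HttpWebRequest)WebRequest.Create(
            "http://myaccount.blob.core.windows.net/mycontainer/myblob.txt");
        request.Method = "GET";

        // Opt in to the 2009-07-17 feature set; requests without the
        // header (or with an older version) keep working as before.
        request.Headers.Add("x-ms-version", "2009-07-17");

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```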
http://www.microsoft.com/azure/netservices.mspx

Silverlight and Deepzoom

Silverlight Deep Zoom provides the ability to view high-resolution images rapidly without affecting the performance of your applications.
This page from silverlight.net has some quick-start tutorials for Silverlight Deep Zoom.
http://memorabilia.hardrock.com/ is one of the best examples of a Silverlight Deep Zoom implementation.
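Wiring this up yourself is mostly a matter of dropping a MultiScaleImage control into a Silverlight page and pointing it at the output of Deep Zoom Composer. A minimal sketch (the class name and file path below are assumptions based on the composer's defaults):

```xml
<UserControl x:Class="DeepZoomDemo.Page"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
  <Grid>
    <!-- Source points at the collection generated by Deep Zoom Composer -->
    <MultiScaleImage x:Name="msi" Source="GeneratedImages/dzc_output.xml" />
  </Grid>
</UserControl>
```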

The page above has hundreds of high-resolution images of rock bands. Below are screenshots of Silverlight Deep Zoom in action.



(A series of screenshots follows: starting from a close-up of a couple of faces that need no introduction, each successive capture zooms out a little more until the photo disappears into a wall of hundreds of images - and Deep Zoom stays smooth at every level.)
This is the beauty of Silverlight Deep Zoom. So next time, instead of sending your large collection of last picnic's photos as huge attachments, create a simple Silverlight site with Deep Zoom and send it with style.

Sunday, July 19, 2009

Data Recovery/Live Software-ERD Commander

ERD Commander is a very useful tool that Microsoft acquired with its purchase of Winternals. It's especially useful for computers that aren't able to boot into Windows, or even safe mode. ERD (Emergency Repair Disk) Commander allows access to Windows restore points, file recovery, crash analysis, hotfix uninstall, and other low-level operating system tasks, all in a very familiar Windows interface. It also provides network and Internet access, as well as a web browser.


The full version is intended for IT professionals and is available only as part of the Microsoft Desktop Optimization Pack for Software Assurance customers. However, you can download a free trial as part of the Microsoft Diagnostics and Recovery Toolset. After downloading this toolset, install it to find an ISO file containing ERD Commander:

C:\Program Files\Microsoft Diagnostics and Recovery Toolset\erd50.iso

An ISO is a special file that, when burned to a CD, creates a bootable CD. Windows doesn't support ISO burning without third-party software; ISO Recorder is a very simple, free program for creating CDs from ISOs. After you've burned the CD from the ISO, simply boot the system from the CD you just created to start ERD Commander. If your system doesn't boot from the CD, you might have to change the BIOS boot settings (boot order).

A couple of caveats: the trial version is ERD Commander 2005; the latest version (currently ERD 6.0) ships only with the Desktop Optimization Pack. While version 6.0 is Vista compatible, 2005 is not (it supports Windows NT, 2000, XP, and Server 2003). The trial period is 30 days. Next time you're faced with an unbootable Windows XP system, give ERD Commander a try.

Saturday, May 2, 2009

Converting Text to RSS Feeds


Today I put together a handy method to parse a raw text document into an RSS feed using a generic handler. The blog engine strips the literal XML element tags out of the Append calls, so the markup below is a reconstruction of them; please refer to the code below.

<%@ WebHandler Language="C#" Class="Cricket" %>

using System;
using System.Web;
using System.Net;
using System.Text;
using System.Linq;

public class Cricket : IHttpHandler {

    public void ProcessRequest(HttpContext context)
    {
        StringBuilder sb = new StringBuilder();
        context.Response.ContentType = "text/xml";

        // RSS channel header
        sb.Append(@"<?xml version=""1.0"" encoding=""utf-8""?>
<rss version=""2.0"">
<channel>
<title>Score</title>
<link>http://renjucool.co.nr</link>
<description>Get latest score</description>
<copyright>(c) 2008 renjucool.co.nr</copyright>");

        // One item carrying the current score
        sb.Append("<item>");
        sb.Append("<title>Score of " + DateTime.Now.ToShortDateString() + "</title>");
        sb.Append("<description>Score of " + GetScore() + "</description>");
        // RFC 822 style date for pubDate
        sb.Append("<pubDate>" + DateTime.UtcNow.ToString("ddd, dd MMM yyyy HH:mm:ss") + " GMT</pubDate>");
        sb.Append("</item>");

        sb.Append(@"</channel>
</rss>");
        context.Response.Write(sb.ToString());
    }

    // Downloads the raw text file and flattens it into a single score string
    public string GetScore()
    {
        string ret = "";
        WebClient myClient = new WebClient();
        String data = Encoding.ASCII.GetString(myClient.DownloadData
            ("http://renju.sparkonnet.com/files/raw.tx"));

        // Split on line feeds and keep only the part after '=' on each line
        foreach (string s in (from s in data.Split((char)10) select GetValue(s)))
        {
            ret += " " + s;
        }
        return ret;
    }

    // Returns the text after the first '=' (or the whole line if there is none)
    public string GetValue(string inp)
    {
        if (inp.IndexOf("=") == -1) return inp;
        return inp.Substring(inp.IndexOf("=") + 1);
    }

    public bool IsReusable {
        get {
            return false;
        }
    }
}

Note: Blogger eats raw HTML/XML tags, so the exact element names are in the screenshot; the ones above are restored from it as best I can.

Thursday, April 23, 2009

v$session for Oracle and SQL Server

select * from sys.sysprocesses

can be used in SQL Server in place of v$session in Oracle.

If you work with SQL Server or Oracle, you will probably need to determine your unique connection identifier at some point. For SQL Server, the connection identifier is called a Server Process ID, or SPID. It can be obtained from the global @@SPID as:
SELECT @@SPID
Once the SPID is determined, it can be used to query various system tables (e.g., sysprocesses, sysobjects, etc.) to obtain process information.
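For instance, a quick look at the current connection's own process row might look like this (column names as found in the SQL Server 2000/2005 sysprocesses view):

```sql
-- Show basic information about the current connection's process
SELECT spid, login_time, status, hostname, program_name
FROM sys.sysprocesses
WHERE spid = @@SPID;
```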
For Oracle, the connection identifier is called a Session ID or SID. To get the Session ID, the V$SESSION view must be queried as:

SELECT sid from v$session where audsid = userenv('sessionid');
When the SID is determined, it can be used to find out session information using other Oracle provided system views such as V$SQL.
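As a sketch, the session can be joined through V$SESSION's address columns to V$SQL to see what it is currently executing (this assumes you have SELECT privileges on the V$ views):

```sql
-- Current SQL text for this session (Oracle)
SELECT s.sid, q.sql_text
FROM   v$session s, v$sql q
WHERE  q.address    = s.sql_address
AND    q.hash_value = s.sql_hash_value
AND    s.audsid     = userenv('sessionid');
```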

Disk I/O:The Performance Bottle neck

Many people think of "performance tuning" as optimizing loops, algorithms, and memory use. In truth, however, the huge performance gains come not from optimizing CPU and memory use (worthwhile as that is), but from eliminating I/O calls.

Disk I/O is responsible for almost all slow websites and desktop applications. It's true. Watch your CPU use next time you open a program, or next time your server is under load. CPUs aren't the bottleneck anymore - your hard drive is. At the hardware level, the hard drive is the slowest component by an incredibly large factor. Today's memory ranges between 3200 and 10400 MB/s. In contrast, today's desktop hard drive speeds average about 50 MB/s (Seagate 500GB), with high-end drives reaching 85 MB/s (WD 640, Seagate 1TB). Looking at bandwidth alone, hard drives are 200-300 times slower. Bandwidth, though, isn't the killer - it's the latency. Few modern hard drives have latencies under 13 milliseconds, while memory latency is usually about 5 nanoseconds - millions of times faster.

You're probably looking at these numbers and thinking, "13ms is quite fast enough for me, and my app is only dealing with small files". However, I have a question: what other applications are using that drive? If you're on a shared server, the odds are high that somewhere between 25 and 2500 ASP.NET apps are running off the same drive.

CPU, bandwidth, and memory throttling is becoming more and more common on shared servers and virtualization systems, but practical disk throttling isn't even on the horizon from what I can tell. Improper I/O usage from any app affects everybody.

Since hard drives are slow, pending operations go into a queue. So even if your app only needs a single byte of data from the hard drive, it still has to wait its turn. It's quite common for disk operations to take several seconds on a shared server under heavy load. If any application on the server is paging to disk from excessive memory use, it can take several minutes, causing a timeout.

Realistic I/O performance is really hard to simulate in a development environment. On a dedicated development machine, disk queues are short and response times are usually near the optimal 13ms, which tends to give software developers gravely incorrect ideas about the performance characteristics of their applications.

This is one of my favourite articles I have read. Click to read more

Monday, April 6, 2009

String Searching in a database

This query searches for a string across an entire database.
Provide two inputs: the database name and the search string.
The query takes a while, but it will surely get you the results.
You can customize the search criteria…


----------------------------------------------------------
Use [Database_Name]
go
declare @SearchChar varchar(8000)
set @SearchChar = 'Search_Text' -- Like 'renju%', '11/11/2006'
declare @CMDMain varchar(8000), @CMDMainCount varchar(8000), @CMDJoin varchar(8000)
declare @ColumnName varchar(100), @TableName varchar(100)

-- Cursor over all user tables
declare dbTable cursor for
    select distinct b.Name as TableName
    from sysobjects b
    where b.type = 'u' and b.Name <> 'dtproperties'
    order by b.Name

open dbTable
fetch next from dbTable into @TableName
while @@FETCH_STATUS = 0
begin
    -- Cursor over the columns of the current table
    declare db cursor for
        select c.Name as ColumnName
        from sysobjects b, syscolumns c
        where c.id = b.id and b.type = 'u' and b.Name = @TableName
        order by b.Name

    open db
    fetch next from db into @ColumnName

    set @CMDMain = 'SELECT ' + char(39) + @TableName + char(39) + ' as TableName,' +
                   ' [' + @TableName + '].* FROM [' + @TableName + ']' +
                   ' WHERE '
    set @CMDMainCount = 'SELECT Count(*) FROM [' + @TableName + '] Where '
    set @CMDJoin = ''

    -- Build one LIKE predicate per column
    while @@FETCH_STATUS = 0
    begin
        set @CMDJoin = @CMDJoin + 'Convert(varchar(5000),[' + @ColumnName + ']) like '
                     + char(39) + @SearchChar + char(39) + ' OR '
        fetch next from db into @ColumnName
    end
    close db
    deallocate db

    -- Strip the trailing OR and only run the SELECT when the table has a hit
    set @CMDMainCount = 'If (' + @CMDMainCount + Left(@CMDJoin, len(@CMDJoin) - 3) + ') > 0 Begin '
    set @CMDMain = @CMDMainCount + @CMDMain + Left(@CMDJoin, len(@CMDJoin) - 3)
    set @CMDMain = @CMDMain + ' End '

    print @CMDMain
    exec (@CMDMain)

    fetch next from dbTable into @TableName
end
close dbTable
deallocate dbTable


Renju

Saturday, March 7, 2009

Top Ten IIS Performance Optimizations

1) Enable HTTP Keep-Alive
This setting is enabled by default. Aside from improving the client connection experience, it must be enabled for integrated authentication (and any connection-based authentication) to work.

2) Adjust Connection Timeouts
Right on the spot, as highlighted by the author. You may want to adjust this according to your needs. To me, 120 seconds feels way too long for a connection timeout; typically I set 30 to 45 seconds. If the IIS server takes longer than that to respond or to receive data, I would just drop the connection - no point holding the resource for more than 30 seconds :) Obviously, you need to evaluate your environment to derive the correct timeout value.

3) Enable HTTP Compression
Yes, some prefer third-party tools like httpzip or xcompress, but if you are like me, you will stick with the built-in compression feature. Refer to this KB to add more document types for compression:
Click How to specify additional document types for HTTP compression

4) Grow a Web Garden
This can be tricky. Although a web garden can boost your application's response time, take note: if you are using any session variables, make sure you have out-of-process session management in place. The worker processes inside a web garden do not share memory space, so with in-process session management, session data is lost whenever a request is handled by a different worker process than before. The following KB explains session management for ASP.NET.
Click
INFO: ASP.NET State Management Overview

5) Adjust the IIS Object Cache TTL
Great suggestion - the default value expires objects too quickly. Typically, I would go for at least 60 seconds, unless your site is purely dynamic and content must not be cached. Related to the object TTL is the UserToken TTL, which defaults to 15 minutes - suitable for most setups. If you have very dynamic user account management (e.g. temp accounts created on the fly and valid only for a short period), you might want to shorten the value. With IIS 6, setting it to '0' disables the UserToken cache. For more info, refer
Click
Changing the Default Interval for User Tokens in IIS

6) Recycle
This applies specifically to IIS 6. While most users will stick with the defaults, I, on the other hand, would disable all recycling events, because I think a well-written, well-tested application should not misbehave and cause issues :) Well, that is just my wish; in real life - especially with the IIS servers you manage - something will haunt you from time to time. I remember the days when a developer would always complain that "something is wrong with your IIS configuration", while you, the system admin, kept fighting back, insisting you were innocent and it was the code causing the problem. Until you find a way to prove that it has nothing to do with IIS and the problem is in the application, your boss will typically side with the developer :) Anyway - back to the topic. If you are seeing problems with your application, you should use the recycling events to keep it available until you figure out what's wrong. Also, always check 'Shutdown worker processes after being idle for' X minutes. Recycling is good: not only does it refresh your application (sometimes I don't agree with this view), it also returns unused resources to the system pool. Oh, and what do you do with IIS 5? Well, you can try the IIS 5.0 Process Recycling tool; refer
Click
How to recycle IIS 5.0 with the IIS 5.0 Process Recycling tool

7) Limit Queue Length
I'm happy with the default limit, though there's a typo in the article: the default is 4000, not 1000 :) And of course, at any given time you should not see a lot of queued requests; if you do, you are either experiencing a hardware bottleneck or something is really wrong with your application.

8) Shift Priority to the Working Set
I am lazy :) I never really change this setting unless it is for SQL Server. If you are running on a low-powered box with little CPU power and memory, you should look into this.

9) Add Memory
Mm... by my standard - given a decent budget - my web servers are typically equipped with 2GB of memory, and this is good enough for most web applications out there. Of course, depending on the number of users and the nature of the application, you may need more RAM or even a network load balancing setup. In my experience, boosting a web server calls for more CPU processing power, while a database server needs tons of memory. Anyway, to make the right call, always do performance monitoring to determine whether memory is the bottleneck.

10) Use Disk Striping
This is like a bonus. In my opinion, since you have no control over the IIS binaries residing on the system partition, you should aim for maximum read throughput for your website pages and maximum write performance for the log files. Hence, use a mirror set for the web pages and disk striping without parity for the log files. Also, it is always best practice to keep the above (IIS binaries, web pages, IIS log files) on three different partitions/drives and to secure them properly with NTFS permissions.

Saturday, February 28, 2009

Enabling HTTP Compression in IIS 6.0

With HTTP compression, when a Web server receives a request for a file, instead of just serving the file to the client it first checks whether the client browser (or application) is "compression enabled". If so, the Web server checks what type of file is being requested (determined by the file's extension). If the file is marked as a static file, such as an HTML file, IIS checks the Compression Cache Temporary Directory.

Tip: To find the Compression Cache Temporary Directory, open up IIS and right-click on the Web Sites node and go to the Service tab. There is a text box that has a label next to it marked Temporary Directory, although it may not yet be enabled.

If a compressed version isn’t found, IIS will send an uncompressed version of the file to the client and a compressed version is placed in the temporary directory (IIS will only serve to the client from the temp directory). If the compressed version is found, IIS will send the file directly to the requesting client. If the requested file is a dynamic file, such as an ASP.NET Web form, then the response is dynamically compressed and sent to the requesting client (no temp directory access is ever done).
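On the wire, the negotiation looks roughly like this (host and file names are made up for illustration): the client advertises gzip support in its request, and IIS answers with a compressed body:

```
GET /default.htm HTTP/1.1
Host: www.example.com
Accept-Encoding: gzip, deflate

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip
Vary: Accept-Encoding
```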

Enabling HTTP Compression on your Windows 2003 Server

There are quite a few steps to enabling HTTP Compression on your server. If you follow the steps in this article, you shouldn’t have any issues.

First, open up IIS and right-click on the Web Sites node and go to Properties. Click on the Service tab. As shown in FIGURE 1, you’ll see two options: Isolation mode and HTTP compression. If the Run WWW service in IIS 5.0 isolation mode check box is checked, IIS will run almost exactly like IIS 5.0. This means you won’t be able to take advantage of things such as Application Pools, which in my opinion are worth the upgrade to IIS 6.0 by themselves.



FIGURE 1: The Web Sites Properties dialog box

We’ll utilize the options within HTTP compression in this article:

Compress application files. Check this to compress application files. If you do select this, you must also have Compress static files checked, although you won't be warned of this need.
Compress static files. Check this to compress static files. After you do so, the Temporary directory text box is active.
Temporary directory. You can leave this at the default, which is %windir%\IIS Temporary Compressed Files, or set it to a custom folder. This is where temporary compressed static files will be stored.
Maximum temporary directory size. This option enables you to set the maximum size of the temporary directory. After the size is met, items are removed based on duration; the older files are removed and the new files are put in.
Next, go to the Web Service Extensions node. Right-click in the right pane, and click Add a new Web service extension. The New Web Service Extension dialog box appears, as shown in FIGURE 2. You can enter any name for the extension, but what others, including myself, recommend is HTTP Compression.



FIGURE 2: The Web Service Extension dialog box

Click on Add. Choose C:\WINDOWS\system32\inetsrv\gzip.dll (your path may be different, but that is doubtful), and click OK. Check the Set extension status to Allowed check box, and click OK.

IIS 6.0 Metabase Configuration - MetaBase.xml
Open up Windows Explorer and go to C:\Windows\System32\inetsrv. Find MetaBase.xml and make a copy (you can just highlight it and press Ctrl-C, then Ctrl-V, to make a copy of MetaBase.xml). Now open MetaBase.xml in a text editor and find the IIsCompressionScheme section. Be careful: there are two such sections, one for deflate and one for gzip. We want gzip, so the Location attribute of the element will have the following value:

Location ="/LM/W3SVC/Filters/Compression/gzip"

Look for the HcScriptFileExtensions section. Your default should have: asp, dll, and exe. This is where you add any extensions you want to compress for dynamic files. In my case, I added aspx.

You’ll notice many other attributes in FIGURE 3. These are the ones I find most important:

HcDoDynamicCompression. Specifies whether dynamic content should be compressed. This is important because dynamic content is by definition always changing, and IIS does not cache compressed versions of dynamic output. Thus, if dynamic compression is enabled, each request for dynamic content causes the content to be compressed. Dynamic compression consumes considerable CPU time and memory resources, and should only be used on servers that have slow network connections, but CPU time to spare.
HcDoStaticCompression. Specifies whether static content should be compressed.
HcDoOnDemandCompression. Specifies whether static files, such as .htm and .txt files, are compressed if a compressed version of the file does not exist. If set to True and no compressed version exists, the user is sent an uncompressed file while a background thread creates a compressed version for the next request.
HcDynamicCompressionLevel. VAL(1-10) specifies the compression level for the compression scheme, when the scheme is compressing dynamic content. Low compression levels produce slightly larger compressed files, but with lower overall impact on CPU and memory resources. Higher compression levels generally result in smaller compressed files, but with higher CPU and memory usage.
HcFileExtensions. Indicates which file name extensions are supported by the compression scheme. Only static files with the specified file extensions are compressed by IIS. If this setting is empty, no static files are compressed.
HcScriptFileExtensions. Indicates which file name extensions are supported by the compression scheme. The output from dynamic files with the file extensions specified in this property are compressed by IIS.




FIGURE 3: Essential attributes

For this example, just add aspx to the HcScriptFileExtensions section as I did, but don’t try to save. It won’t work because the file is locked by default if IIS is running.

Tip: To change this default behavior, open IIS and right-click on the top node, Internet Information Services, and check Enable Direct Metabase Edit.

Warning: The help documents state that you should use a space-delimited list for the file extensions, but I have found this to be incorrect. Instead, use new lines and tabs, like the following:

HcCompressionDll="%windir%\system32\inetsrv\gzip.dll"
HcCreateFlags="1"
HcDoDynamicCompression="TRUE"
HcDoOnDemandCompression="TRUE"
HcDoStaticCompression="TRUE"
HcDynamicCompressionLevel="10"
HcFileExtensions="htm
html
txt"
HcOnDemandCompLevel="10"
HcPriority="1"
HcScriptFileExtensions="asp
dll
exe
aspx">


The final step is to shut down and restart IIS: right-click the Internet Information Services node, then click All Tasks, Restart IIS.