MS SQL Server Technology Discussions


9Pardhu

Recommended Posts

[quote name='SonyVaio' timestamp='1360181916' post='1303236440']

Bro, I didn't understand.
Can you tell me how I should solve this now?

I have to set the timestamp datatype on 2 columns, because that is the condition I was given.

If we cannot set the timestamp datatype on both columns,
[b]can I set timestamp on one and datetime on the other?[/b]
[/quote]
Yes, you can, and that is the best option too. That way, if you later want to pull a record based on the latest timestamp for that field, you can use the DATEPART functionality to extract the time you need.
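A minimal T-SQL sketch of that setup, using a hypothetical Orders table (SQL Server allows only one timestamp/rowversion column per table, so the second column is datetime):
[CODE]-- Hypothetical table: one rowversion (timestamp) column plus one datetime column
create table Orders (
    OrderId    int identity primary key,
    OrderDesc  varchar(200),
    RowVer     rowversion,                  -- only one allowed per table; changes on every update
    ModifiedAt datetime default getdate()   -- human-readable change time
)

-- Latest-changed row, with DATEPART pulling the time parts out of the datetime column
select top 1
    OrderId,
    datepart(hour, ModifiedAt)   as ModifiedHour,
    datepart(minute, ModifiedAt) as ModifiedMinute
from Orders
order by RowVer desc[/CODE]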


[quote name='DARLING...' timestamp='1360183124' post='1303236560']

Give one of them datetime; you cannot give two timestamp datatypes.
[/quote]

OK bro, will proceed with 1 timestamp and 1 datetime datatype :)


--Author: <Author,,Name>
-- Create date: <Create Date,,>
-- Description: <Description,,>

If I generally write my name here, today's date, and a description of the query below, will that be enough?


[quote name='SonyVaio' timestamp='1360183980' post='1303236643']
--Author: <Author,,Name>
-- Create date: <Create Date,,>
-- Description: <Description,,>

If I generally write my name here, today's date, and a description of the query below, will that be enough?
[/quote]
That's it. Write the description so that the next person who looks at it can understand what you did just from what you wrote, even if they know nothing else about it.
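A hypothetical filled-in version of that header, just to show the level of detail meant here (the name, date, and description below are made up):
[CODE]-- Author:      SonyVaio
-- Create date: 2013-02-06
-- Description: Pulls orders modified in the last 24 hours so the nightly
--              report job only picks up new changes. Run after the ETL load.[/CODE]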


[quote name='chelsea' timestamp='1360184421' post='1303236683']
Bro, what different methods do you all use for delta/full loads in SSIS?
[/quote]
Are you asking about incremental loads?


[quote name='DARLING...' timestamp='1360187103' post='1303236925']
That's it. Write the description so that the next person who looks at it can understand what you did just from what you wrote, even if they know nothing else about it.
[/quote]

OK bro, thanks


[quote name='chelsea' timestamp='1360184421' post='1303236683']
Bro, what different methods do you all use for delta/full loads in SSIS?
[/quote]
[quote name='DARLING...' timestamp='1360187124' post='1303236929']
Are you asking about incremental loads?
[/quote]Same doubt here.
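For reference, one common incremental (delta) load pattern, often issued from an SSIS Execute SQL Task once new data has landed in a staging table, is a MERGE keyed on the business key. A minimal sketch with hypothetical table and column names:
[CODE]-- Hypothetical staging and target tables; only new or changed rows are applied
merge dbo.Customer as tgt
using dbo.Customer_Staging as src
    on tgt.CustomerId = src.CustomerId
when matched and src.ModifiedDate > tgt.ModifiedDate then
    update set tgt.CustomerName = src.CustomerName,
               tgt.ModifiedDate = src.ModifiedDate
when not matched by target then
    insert (CustomerId, CustomerName, ModifiedDate)
    values (src.CustomerId, src.CustomerName, src.ModifiedDate);[/CODE]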


[b] Exploring the Varbinary Type[/b]

[b] Introduction[/b]

It occurred to me recently that I had never actually used a binary or [url="http://technet.microsoft.com/en-us/library/ms188362.aspx"]varbinary[/url] field in SQL Server for anything. I suspect that part of the reason for this is that there are few occasions where using a varbinary field is the best approach. In most cases, I think binary data is best stored outside of the database, with the database holding a simple file location. Of course, in SQL Server 2008 and later it is possible to use the [url="http://technet.microsoft.com/en-us/library/bb933993%28v=sql.100%29.aspx"]FileStream[/url] option to store varbinary(max) data in the filesystem.

Since I have never really used it and I want to have a thorough understanding of SQL Server, I decided to create some scripts to work with it and just see how varbinary data is used. Varbinary data needs to be handled differently than some other types of data to use it effectively. Most other data types used by SQL Server, like int or varchar, are human readable, whereas varbinary often needs to go through some form of conversion to be used.

In this article I take a very brief look at the ways that varbinary can be used, and I include a couple of examples to show how the data can be inserted, read, and converted into a usable format. This article is geared towards beginners, or at least people who have limited experience with varbinary even if they have a lot of experience with SQL. The examples in Section III are mostly independent of each other, and you can easily look at just the one for the language you prefer. The examples are also somewhat artificial, since they were entirely made up to show a fairly simple usage.
[b] Some Background[/b]

The varbinary(max) data type can store up to 2GB of arbitrary binary data. Since it can hold arbitrary binary data, it can be extremely versatile with the correct conversion routines. One common use is to replace the now deprecated [url="http://msdn.microsoft.com/en-us/library/ms187993.aspx"]Image[/url] datatype. It can also provide a simple and straightforward way to store, in a relational database, complex datatypes that might come up in object-oriented programming. Similarly, it can directly hold the contents of virtually any file. It also plays a significant role if you want to encrypt the data in a column in SQL Server, as K. Brian Kelley explained in his [url="http://www.mssqltips.com/sqlservertip/2840/sql-server-encryption-to-block-dbas-data-access/"]tip[/url].

However, as seen in the examples below, most data needs to go through some conversion before it can be inserted into a varbinary column, or before it can be used after retrieval from varbinary. SQL Server will implicitly convert some datatypes, such as int and bigint, when inserting the data into a varbinary column, but varchar data needs to be explicitly converted; and, depending on the libraries being used, applications working with a varbinary column may need to do additional conversions.
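As a quick illustration of that difference, here is a minimal T-SQL sketch using a throwaway table variable:
[CODE]declare @t table (binData varbinary(max))

insert into @t (binData) values (12345)                             -- implicit conversion from int
insert into @t (binData) values (convert(varbinary(max), 'hello'))  -- varchar must be converted explicitly

-- Converting back to varchar makes the second row readable again;
-- the int-based row just comes back as raw bytes
select binData, convert(varchar(max), binData) as asText from @t[/CODE]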
[b] A Few Quick Examples[/b]

Before moving on to the actual examples, we will need a test table. Here we will use a very simple version:
[CODE]create table blobTest (
id int identity primary key,
dataDesc varchar(200),
binData varbinary(max)
)[/CODE]
All of the following examples use this table. They were all designed to be fairly simple, but are meant to show ways to get different types of data into and out of a varbinary column in SQL.
[b] A C# Byte Array (VS 2008)[/b]

Since C# is one of the more common languages used in conjunction with SQL Server, it seemed like a natural place to start. This example is almost trivially simple in that it starts with just string data and inserts it into a varbinary column. But it does give a basic example of how to [url="http://www.dotnetperls.com/convert-string-byte-array"]encode[/url] data into a byte array, which can then be inserted into a varbinary column, and how to decode it afterwards. Anything which can be encoded into a byte array can be stored in a varbinary column, including data which has gone through client-side encryption or complex objects which can be [url="http://msdn.microsoft.com/en-us/library/system.runtime.serialization.formatters.binary.binaryformatter.aspx"]serialized[/url] to a byte array.
[CODE]using System;
using System.Text;
using System.Data.Sql;
using System.Data.SqlClient;

namespace Csharp_blob_test
{
class Program
{
static void Main(string[] args)
{
using (SqlConnection conn = new SqlConnection("YourConnectionString"))
{
conn.Open();
byte[] testArray = Encoding.ASCII.GetBytes("We will store this string as varbinary");

string sqlCom = "insert into blobTest (dataDesc, binData) values (@dataDesc, @binData)";
SqlCommand cmd = new SqlCommand(sqlCom, conn);
cmd.Parameters.Add("@dataDesc", System.Data.SqlDbType.VarChar);
cmd.Parameters["@dataDesc"].Value = "Array From VS2008";
cmd.Parameters.Add("@binData", System.Data.SqlDbType.VarBinary);
cmd.Parameters["@binData"].Value = testArray;
cmd.ExecuteNonQuery();

//Now read it back in to verify
sqlCom = "select binData from blobTest where dataDesc = 'Array From VS2008'";
cmd = new SqlCommand(sqlCom, conn);
SqlDataReader reader = cmd.ExecuteReader();
while (reader.Read())
{
//This one will show System.Byte[]
Console.WriteLine("Raw Value = {0}", reader[0]);
//Explicitly cast to byte[] and convert back to a string.
Console.WriteLine("Converted Value = {0}", ASCIIEncoding.ASCII.GetString((byte[])reader[0]));
}

reader.Close();

}
}
}
}[/CODE]

[b] A user supplied file from PowerShell[/b]

[url="http://www.simple-talk.com/sql/database-administration/im-a-sql-server-dba,-and-im-in-love-with-powershell/"]PowerShell[/url] is another tool that makes interacting with SQL server relatively straightforward. It has access to .Net libraries including SMO and makes certain types of scripting simple and straightforward. Loading data through PowerShell is similar to doing it with C# or other .Net languages, and this example shows how to load an arbitrary file from the filesystem. With a little modification it could load a directory of files with appropriate metadata into SQL Server. For instance, with a little tweaking, something like this could load numerous images for supply to other programs.
[CODE]#Connect to the database
$connStr = "YourConnStr"
$conn = new-object System.Data.SqlClient.SqlConnection($connStr)
$conn.Open()

#Load a file
$filename = Read-Host 'What file should be loaded?'
$desc = Read-Host 'What description should it have?'
[Byte[]]$binData = get-content -Encoding Byte $filename

#prepare
$cmd = new-object System.Data.SqlClient.SqlCommand("Insert into blobTest (dataDesc, binData) values(@desc, @bin)", $conn)
$cmd.CommandType = [System.Data.CommandType]'Text'
$cmd.Parameters.Add("@desc", [System.Data.SqlDbType]'VarChar')
$cmd.Parameters["@desc"].Value = $desc
$cmd.Parameters.Add("@bin", [System.Data.SqlDbType]'VarBinary')
$cmd.Parameters["@bin"].Value = $binData

#execute
$cmd.ExecuteNonQuery()
$conn.Close()[/CODE]
Michael Otey [url="http://www.sqlmag.com/content/content/92995/Listing_01.txt"]demonstrated[/url] a somewhat similar technique in an older article, using OPENROWSET to insert data from files into a varbinary column.
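For completeness, a minimal sketch of that server-side style using OPENROWSET with the SINGLE_BLOB option (the file path here is hypothetical and must be readable by the SQL Server service account):
[CODE]insert into blobTest (dataDesc, binData)
select 'File via OPENROWSET', fileData.BulkColumn
from openrowset(bulk 'd:\somefile.jpg', single_blob) as fileData[/CODE]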
[b] Pickle and arbitrary objects in Python 2.7[/b]

[url="http://www.simple-talk.com/sql/database-administration/python-for-the-sql-server-dba/"]Python[/url] is a language I use frequently with SQL Server and can be used in an object oriented fashion. It can be useful to be able to store complete objects with minimal translation in the database where they can be quickly retrieved in an organized fashion. With help from the [url="http://docs.python.org/library/pickle.html"]Pickle[/url] library, we can store a wide variety of complete Python objects in the database. This particular example simply uses a Python list with a small variety of data types in it to show that a relatively complex object can be stored simply. However, this technique would work equally well for any object which was compatible with pickle, including complex user defined types.

Generally, I personally think it is better to break down an object into component parts and store those parts in SQL Server, rather than trying to store complex objects in a varbinary format. By breaking it down into native SQL types where possible, it makes it easier to examine those types and perhaps run analysis on them later. It can, sometimes, also make it easier to bring parts of that data into a different program without bringing the entire object in. [url="http://en.wikipedia.org/wiki/Object-relational_mapping"]Object-relational mapping[/url] tools like [url="http://www.sqlalchemy.org/"]SQLAlchemy[/url] can help with the conversion between program objects and native datatypes in SQL. However, there are certainly times when it makes sense to quickly place complex objects into a table with only minimal conversion.
[CODE]import pyodbc, pickle

#Create the SQL connection
sqlConnStr = ('DRIVER={SQL Server};Server=(local);Database=TestDb;'+
'Trusted_Connection=YES')

sqlConn = pyodbc.connect(sqlConnStr)

#Create a list with int, float, complex, string, T/F data, and a tuple
testList = [1, 2.1, 3+4j, 'a', True, (5, 6)]
llPickle = pickle.dumps(testList) #Then pickle it

#Store the list in the database
curs = sqlConn.cursor()
sql = """
insert into blobTest (dataDesc, binData)
values (?, ?)"""
curs.execute(sql, ('TestList', pyodbc.Binary(llPickle)))

sqlConn.commit()

#retrieve the Data
sql = """
select top 1 --top 1 in case this script gets run repeatedly
dataDesc, binData
from
blobTest
where
dataDesc = ?
"""
curs.execute(sql, ('TestList', ))
values = curs.fetchall()
newLL = pickle.loads(values[0][1])

print newLL[/CODE]
One significant thing to note is the call to pyodbc.Binary rather than just providing the variable as a parameter to the query. Omitting that step will result in an implicit conversion error. The call to pyodbc.Binary could be replaced with a call to the built-in bytearray, though. The pickling process is also necessary: trying to pass the testList directly as a parameter to the query results in an "Invalid parameter type" error, and trying to pass it to bytearray gives a TypeError.

[b] An Image in Python 2.7[/b]

While the Pickle library can make it easy to store many complex objects, some types of objects cannot be immediately pickled. So, other approaches are needed to put those into a varbinary field in SQL.

To take a straightforward example, here is a short Python script that makes a simple image file, in this case a straightforward blue square.
[img]http://www.sqlservercentral.com/Images/17314.jpg[/img]

The script then stores it in the table, retrieves it, and displays it, using [url="http://code.google.com/p/pyodbc/"]pyodbc[/url] to interface with SQL Server, [url="http://www.pythonware.com/products/pil/"]PIL[/url] to create a simple image, and [url="http://docs.python.org/library/stringio.html"]StringIO[/url] to provide a buffer to hold the BMP version. It looks like this:
[CODE]from PIL import Image #create the image
from StringIO import StringIO #stores the bmp for the image temporarily
import pyodbc

#Create the SQL connection
sqlConnStr = ('DRIVER={SQL Server};Server=(local);Database=TestDb;'+
'Trusted_Connection=YES')

sqlConn = pyodbc.connect(sqlConnStr)

#Create an image to store and prepare it for storage
img = Image.new('RGB', #mode
(320, 240), #size
(0, 0, 255) #color, this is blue
)
imgBuffer = StringIO()
img.save(imgBuffer, "BMP")

#Store the image in the database
curs = sqlConn.cursor()
sql = """
insert into blobTest (dataDesc, binData)
values (?, ?)"""
curs.execute(sql, ('Blue Image', pyodbc.Binary(imgBuffer.getvalue())))
sqlConn.commit()

#release the memory
img = None
imgBuffer.close()

#retrieve the image
sql = """
select top 1 --top 1 in case this script gets run repeatedly
dataDesc,
binData
from
blobTest
where
dataDesc = ?
"""
curs.execute(sql, ('Blue Image', ))
values = curs.fetchall()

#Bring it back as an image
imgBuffer2 = StringIO(values[0][1])
img2 = Image.open(imgBuffer2)

#Image.show is meant for debugging and testing.
#It may generate messages about 'sleep' not being recognized
#but it works nicely for a test script like this one.
img2.show()[/CODE]

Essentially, since a PIL image cannot be pickled directly, we are reducing it to a BMP which we are placing in the buffer provided by StringIO. While this is one convenient option, it is hardly the only one. Instead we could decompose it into a tuple of the image's mode, size, and data using the appropriate properties, and that tuple would be fully compatible with Pickle. To return it to an image, we would use the [url="http://www.pythonware.com/library/pil/handbook/image.htm"]Image.fromstring[/url] method with that data.

Just like the last example, this script uses an explicit conversion with pyodbc.Binary to convert it to a bytearray before it is passed to SQL Server. Omitting that will generate an “Operand type clash”. Alternatively, in Python 2.6 or later a direct call to the [url="http://infohost.nmt.edu/tcc/help/pubs/python/web/bytearray-type.html"]bytearray[/url] conversion also works well.

Trying to use pyodbc.Binary (or bytearray) directly on the PIL Image generates a TypeError, which is why this script uses the StringIO library to reduce the PIL Image to a buffer which is then handed off.
[b] Conclusions[/b]

Generally, I think it is better to store binary data, especially large binary data, outside of the database and instead include a pointer within the database. But it is certainly good to know how to store things like images or complete object-oriented entities in the database in a binary fashion for the cases where it is beneficial. When it is, varbinary(max) provides a straightforward and flexible way to store arbitrary binary objects inside the database, whether they are complex objects to be handed to an object-oriented program, graphics to be displayed on a website, or sound files to be consumed by a player. Although varbinary almost always requires some sort of conversion to be useful, and thus can require a bit more work than other datatypes, there are certainly cases where that work is well worth it.


[b] Inside the SQL Server Query Optimizer - Free eBook[/b]

[b]Download eBook (PDF)[/b]: [b][url="http://www.red-gate.com/products/sql-development/sql-prompt/entrypage/sql-query-optimizer-ebook?utm_source=ssc&utm_medium=weblink&utm_content=queryoptimizer1&utm_campaign=sqlprompt"]Download here[/url][/b]
[b]Buy Printed Book[/b]: available on [b][url="http://www.amazon.com/Inside-SQL-Server-Query-Optimizer/dp/1906434603/ref=sr_1_11?ie=UTF8&qid=1301970045&sr=8-11"]Amazon.com[/url] [/b]or [url="http://www.amazon.co.uk/Inside-SQL-Server-Query-Optimizer/dp/1906434603/ref=sr_1_1?ie=UTF8&qid=1301970172&sr=8-1"] [b]Amazon.co.uk[/b][/url].
[b]Download ePub: [url="ftp://ftp.simple-talk.com/simplepod/InsideSQLServerQueryOptimizer.epub"]Download here[/url][/b]


[b] Evaluating Policies on Multiple SQL Server Instances using Central Management Server[/b]


[b] Problem[/b]

So, you have created a SQL Server Policy to [url="http://www.mssqltips.com/sqlservertip/2060/identify-sql-server-databases-not-in-full-recovery-model/"]check database recovery models[/url]. Now you need to check the databases on all of your SQL Server instances. In this tip we will show how you can evaluate a SQL Server policy against multiple instances.
[b] Solution[/b]

To evaluate the policy against multiple SQL Servers you can use Central Management Servers or Registered Servers. In this tip we will evaluate a policy against SQL Server 2005, SQL Server 2008 and SQL Server 2012 using Central Management Servers.
[b] Prerequisites[/b]

To learn more about Policy Based Management please review this tip: [url="http://www.mssqltips.com/sqlservertip/1492/using-policy-based-management-in-sql-server-2008/"]Using Policy Based Management in SQL Server 2008[/url].

To learn more about Central Management Servers please review this tip: [url="http://www.mssqltips.com/sqlservertip/1767/execute-sql-server-query-on-multiple-servers-at-the-same-time/"]Execute SQL Server query on multiple servers at the same time[/url].
[b] Checking Policies Across Instances[/b]

After you have set up Central Management Servers and created the policy, follow these steps:


[b] Step 1[/b]

In SQL Server Management Studio (SSMS) click the "View" menu and then click "Registered Servers"

[center][img]http://www.mssqltips.com/tipimages2/2859_IMG1.gif[/img][/center]


[b] Step 2[/b]

Right click on the server or server group under Central Management Servers and select "Evaluate Policies..."

[center][img]http://www.mssqltips.com/tipimages2/2859_IMG2.gif[/img][/center]


[b] Step 3[/b]

Select the source where the policy is saved:

[center][img]http://www.mssqltips.com/tipimages2/2859_IMG3.gif[/img][/center]



[b] Step 4[/b]

Select the policy you want to evaluate:

[center][img]http://www.mssqltips.com/tipimages2/2859_IMG4.gif[/img][/center]


[b] Step 5[/b]

Click "Evaluate" and then review the results as shown below:

[center][img]http://www.mssqltips.com/tipimages2/2859_IMG5.gif[/img][/center]


[b] Advanced Logging - Stairway to Integration Services[/b]

http://www.sqlservercentral.com/articles/Integration+Services+%28SSIS%29/96657/


[size=5]Precision Indexing: Basics of Selective XML Indexes in SQL Server 2012[/size]

http://www.simple-talk.com/sql/learn-sql-server/precision-indexing--basics-of-selective-xml-indexes-in-sql-server-2012/?utm_source=ssc&utm_medium=email-main&utm_content=precisionindexing-20130204&utm_campaign=SQL


[b] [url="http://www.sqlservercentral.com/blogs/steve_jones/2013/01/28/granting-permission-to-grant-permissions/"]Granting Permission to Grant Permissions[/url][/b]


[b] Backup and Restore Database Using Command Prompt – SQLCMD[/b]


Backup and restore is one of the core tasks for DBAs. They often have to do it more times than they would ideally like. One thing I have noticed in my career is that every successful DBA knows how to automate their tasks and then spend the time saved either playing games on a computer or learning something new!

Let us see how a DBA can automate backup and restore using SQLCMD. I am demonstrating a very simple example in this blog post. Please adapt the script based on your environment and needs.
[b] Here is the basic script for Backup and Restore[/b]

Note: the name of my database server is [i]touch[/i] and I am connecting to it with Windows authentication.
[b] Backup[/b]

[CODE]
BACKUP DATABASE AdventureWorks2012 TO DISK='d:\adw.bak'
[/CODE]
[b] Restore[/b]

[CODE]
RESTORE DATABASE AdventureWorks2012 FROM DISK='d:\adw.bak'
[/CODE]
[b] Here is the basic script for Backup and Restore using SQLCMD[/b]

[b] Backup[/b]

C:\Users\pinaldave>SQLCMD -E -S touch -Q "BACKUP DATABASE AdventureWorks2012 TO DISK='d:\adw.bak'"
[b] Restore[/b]

C:\Users\pinaldave>SQLCMD -E -S touch -Q "RESTORE DATABASE AdventureWorks2012 FROM DISK='d:\adw.bak'"

[img]http://www.pinaldave.com/bimg/sqlcmd-backup-restore.png[/img]
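As a hypothetical extension of the same one-liner style (the logical file names and paths below are made up; check yours with RESTORE FILELISTONLY), the backup can also be restored as a copy database:

C:\Users\pinaldave>SQLCMD -E -S touch -Q "RESTORE DATABASE AdventureWorks2012_Copy FROM DISK='d:\adw.bak' WITH MOVE 'AdventureWorks2012_Data' TO 'd:\adw_copy.mdf', MOVE 'AdventureWorks2012_Log' TO 'd:\adw_copy.ldf'"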

