I was recently in need of a guide to help me get a bulk import/export done, and I was stuck because I could not find one good article covering it end to end. It was all in bits, hard to gather in one place. So let me take this up and put it all down here; I hope it helps towards a better understanding.


We had some data in a flat file populated by an ETL (Informatica) tool, and I had to load that same data into my SQL database table as a bulk load. For bulk loading we have the BCP utility, which comes in handy for loading from a data file into a SQL table and vice versa.

BCP supports both import and export to move data into and out of a SQL table. The standard BCP command syntax is as follows,

bcp {db_name.schema_name.Table_Name} in|out {File_Path\file_name.txt} -S {server_name} -t 'field_terminator' -T

here we have,
-S -> server name
-t -> field terminator (for example "\t" for tab delimited, "," for comma delimited)
in/out -> 'in' to import from a data file into the SQL table and 'out' to export from the SQL table to a data file
-T -> to allow a trusted connection (for a non-trusted connection we define '-U' user_name and '-P' password)

This is the simple bcp command that can help you load data faster and in an elegant way.
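To make the syntax concrete, here is a hypothetical pair of invocations, one import and one export, against an AdventureWorks table (the server name, table, and file paths are illustrative, not from the original post):

```
bcp AdventureWorks2008.Production.TransactionHistory in C:\Data\transactions.txt -S MYSERVER -t "\t" -T
bcp AdventureWorks2008.Production.TransactionHistory out C:\Data\transactions_out.txt -S MYSERVER -t "," -T
```

The first line loads a tab-delimited file into the table; the second exports the table to a comma-delimited file.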

There is one add-on feature, a parameter you can define for the bcp command, that makes the mapping between the data file and the table much tighter and provides a flexible system for writing data files that require little or no editing to comply with other data formats: the format file.

We need to create the format file before loading the data with the bcp command, and the syntax is as follows,

bcp {db_name.schema_name.Table_Name} format nul -c -f {Format_File_Path\file_name.fmt} -S {server_name} -T

format nul -c -f -> to create a format file (in character mode) at the given path and file name (.fmt or .xml)
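For instance, generating a format file for a hypothetical AdventureWorks table might look like this (the server, table, and output path are illustrative):

```
bcp AdventureWorks2008.Production.TransactionHistory format nul -c -f C:\Data\transactions.fmt -S MYSERVER -T
```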

We can have either an XML or a non-XML format file; here I go with a non-XML format file, which gives a structure that looks as follows.
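A sketch of what a non-XML format file typically contains, assuming a hypothetical four-column table (the column names, lengths, and terminators here are illustrative): the first line is the bcp version, the second the number of columns, and each following row maps a field in the data file to a table column.

```
10.0
4
1  SQLCHAR  0  12  "\t"    1  ProductID         ""
2  SQLCHAR  0  12  "\t"    2  ReferenceOrderID  ""
3  SQLCHAR  0  12  "\t"    3  Quantity          ""
4  SQLCHAR  0  30  "\r\n"  4  ActualCost        ""
```

Each row lists the host-file field order, data type, prefix length, field length, terminator, server column order, server column name, and collation.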


Hope this helps you get a bulk load of your data done, to and from a data file and the SQL table. Learn and share 🙂

Just as I was done with my SQL Server installation, all I needed was a sample database to work on. This requires downloading the sample DB made available by Microsoft.

All I had to do was to follow the below simple sequence of steps,

  1. Download the AdventureWorks 2008 OLTP Script.zip, @ the link http://msftdbprodsamples.codeplex.com/releases/view/89502,
  2. Copy all the files from the downloaded zip to any folder location; I copied the files to 'C:\AdventureWorks 2008 OLTP Script'
  3. Now all we have to do is open 'instawdb.sql' at C:\{AdventureWorks 2008 OLTP Script Path}\ with SQL Server Management Studio.
  4. The script has to be edited as per the need; as I have done below, uncomment the variables
  5. Set the variable SqlSampleDatabasePath to the Data folder location, which by default is mostly "C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\", and also set SqlSampleSourceDataPath to "C:\AdventureWorks 2008 OLTP Script\", the location where your .CSV files are placed.
  6. Do not forget to set the query mode to SQLCMD mode in the Query menu, as below,
  7. Execute the query now and we are done; the sample database AdventureWorks2008 is installed into our SQL Server. Share this to help someone.

    This results in the AdventureWorks2008 sample Database added to our SQL Server,
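The edits in steps 4 and 5 come down to uncommenting two :setvar lines and pointing them at your paths. A sketch of how that section of the script might look after editing (take the exact variable names from your own copy of instawdb.sql):

```sql
:setvar SqlSampleDatabasePath "C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\"
:setvar SqlSampleSourceDataPath "C:\AdventureWorks 2008 OLTP Script\"
```

These :setvar directives only work when the query window is in SQLCMD mode, which is why step 6 matters.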


I was setting up SQL Server 2008 on my desktop for the second time, and I had lots of confusion; the take-off was never going well, but finally I was able to get it done. So I decided to document the whole process step by step for others to make the best use of it.

As usual, start the setup by clicking on the setup.exe file,

Then we have the installation wizard to follow by,

Go as per the screenshot; click on Installation to launch a new instance of SQL Server,

Get the Product key that comes with your pack and type it as shown,

Then a list of setup rules appears, with a warning on the firewall setup, which can be ignored and fixed once our setup is installed.

Then select the type of installation. Preferably ‘All Features with Default’ would be good to go with,

Let all the features be selected,

Instance configuration, let this be the default MSSQLSERVER,

For the server configuration, set the account types as below, which should be fine to go with, and let the startup type be Automatic,

In the Database Engine Configuration, let's have mixed mode to enable both Windows and SQL Server authentication.

Either add the current user, or add a new user by clicking on 'Add..', then 'Advanced..' followed by 'Find Now', and select the user as required.

In the case of the Analysis Services Configuration, similar users can be added following the same series of steps,

For Reporting Services Configuration select ‘Install native mode’,

And we are almost done; click Next and then Install..

We are done 🙂 and here appears the last screen of the installation showing the success message,

Please post your queries and comments and share to help someone.

This was an interesting SQL concept I came across: checking for distinct records to get the distinct combination of A, B, C, and then getting a single row per A from that distinct result. I am sure I have confused you!!

OK.. let me put this in a simpler form. We arrive at the distinct combinations of Product, Price, Size. This gives a result something like below,

And to get the same scenario in my sample table, I query the Product table, taking only a single product, which gives all the distinct combinations as below (only one product taken here, for instance),
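A sketch of such a distinct-combinations query against the AdventureWorks transaction table (the columns and the product filter are illustrative; the original post's exact query is not shown):

```sql
SELECT DISTINCT ProductID, ReferenceOrderID, ActualCost
FROM Production.TransactionHistory
WHERE ProductID = 784;  -- restrict to a single product, for instance
```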

But what if I need only the one combination with the maximum price? This can be attained using the ROW_NUMBER() function, as in my sample query,

SELECT ProductID, ReferenceOrderLineID, ReferenceOrderID, ActualCost
FROM (SELECT Production.TransactionHistory.*,
             ROW_NUMBER() OVER (PARTITION BY ProductID ORDER BY ActualCost DESC) AS RowNum
      FROM Production.TransactionHistory) AS t
WHERE RowNum = 1

And here we go with the single, distinct transaction per product with the maximum cost, as shown below,

So we can use this in cases where the DISTINCT and GROUP BY clauses might be of no help. Thanks for the read; share and enjoy 🙂

This is an interesting thing you can do while adding an index as required for your update session. As per our practice for better performance in Informatica, we prefer to create an index while performing the data load. This is usually achieved by adding the index creation in the Pre SQL of the Target transformation (before loading into the target table) and then finally dropping the same index in the Post SQL of the target transformation.

This was all going well: the index getting created, the session succeeding, and finally the index getting dropped after the target load. But then I faced a scenario where my target load failed for some reason; the session completed without executing the Post SQL, so the index was not dropped. When I executed the same session again, to my surprise it failed, saying the index already exists!! That was all because the index created in the previous, failed run had never been dropped.

Hmmm.. this is an interesting scenario and requires a permanent solution rather than human intervention to drop the index every time the session fails. On my teammate's advice, I was able to put a stop to this issue with a simple SQL query that first checks for the existence of the index and drops it before the creation. The Pre SQL shall then look something like below,

IF EXISTS (SELECT name FROM sysindexes WHERE name = 'index_update')
    DROP INDEX [index_update] ON [Sample_Table];

CREATE NONCLUSTERED INDEX [index_update] ON [Sample_Table]
( [Column_Name] );

Looks simple, but a very crucial one for the smooth running of your workflow. And as usual, keep the plain DROP INDEX in the Post SQL,

Drop index [index_update] on [Sample_Table]
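If you want the Post SQL to be just as defensive (purely a suggestion, not part of the original workflow), the same existence check can guard the drop there too, so a rerun never fails on a missing index:

```sql
IF EXISTS (SELECT name FROM sysindexes WHERE name = 'index_update')
    DROP INDEX [index_update] ON [Sample_Table];
```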

Thanks for a read, Just share with others as this may help someone.. as i did 🙂

I had the whole stage set, with my SQL Server 2008 R2 installed and ready to use. But this did not bring relief; there was something that kept me pulling my hair for almost two full weekends after finishing the installation 🙁

I was struggling to get my SQL Server on track to connect to the database engine!!

It was all saying – “error: 40 – could not open a connection to SQL Server”.


All I had to do was look for the SQL Server services: right-click on My Computer (Computer) -> Manage -> Computer Management. From there, go to Services and Applications and then to SQL Server Configuration Manager, to find the settings below under SQL Server Services,


So this is the root cause stopping us from logging in to the Database Engine, and the simple solution is to right-click the service and look for its Properties,


Then all you have to do is start the services and give your database connection one more try. In case you find any problem with the login user, simply give the user name as 'sa' and a blank password with SQL Server Authentication. This must resolve things and get you logged in. Finally my wait was over, after almost two full weeks 🙂

Thanks for reading, share this and it must help some one out.


Hello all!!

Just to start with one SQL requirement I met at work..

Requirement: I had to make a batch script to run my SQL query on the server, which can very well be done with the query fed through a .sql file and the .sql file executed from a batch file via a sqlcmd command. But.. it goes like this: I had to run multiple .sql files having different query sets in them, and I had to make sure I don't execute all of them together; they have to run one after the other, to avoid any SQL blocks occurring on my server.

Solution: On continuous googling I witnessed different solutions, one saying to run the sqlcmd command with the input file given as PATH\*.sql, but that runs all the SQL files in the directory, in no guaranteed order, which again proves not right.

And here I got this one good solution for multiple selected query files.

Step 1: Get all your SQL query files ready, as I have done here with five files: sql_query1.sql, sql_query2.sql, sql_query3.sql, sql_query4.sql and sql_query5.sql.

Step 2: Get your master SQL file with the following script; I have named it master_sql.sql

— This is the main caller for each script

:r c:\Scripts\sql_query1.sql
:r c:\Scripts\sql_query2.sql
:r c:\Scripts\sql_query3.sql
:r c:\Scripts\sql_query4.sql
:r c:\Scripts\sql_query5.sql


Step 3: And finally, all you have to do is get your batch file ready to call the master SQL file; I have named it call_master.bat

set Server_Name=sqlserver_name
set Database_Name=AdventureWorks
sqlcmd -S %Server_Name% -d %Database_Name% -e -s"," -b -i c:\Scripts\master_sql.sql > c:\Scripts\output.txt

Step 4: Just execute your .bat file and see the queries working smooth and swift 🙂

Share and let others know because sharing is good.
