I have an Excel data source which must be unpivoted, have CASE logic applied to it, and be joined with lookup tables to pull in additional values. Here's what I've got working so far:

1. Excel Source
2. Data Conversion transform (to handle Unicode coming in from Excel and going to SQL)
3. Unpivot transform
4. Derived Column transform
5. Lookup transform

I've performed the above steps (1-5). The Lookup transform is easy enough to configure, i.e. I supply the columns whose values must match and retrieve an associated value. But the final output of the Lookup transform is now just the one value I looked up. Since I need all the values of the Derived Column output PLUS the looked-up value from the Lookup transform merged together into the final output, I turned for a while to the Multicast and Merge transforms. But the Merge transform in particular requires so much niggling with sort properties that before I proceed further I want to make sure I'm taking the best route with SSIS transforms.

Will someone please advise on the best method for joining the output of the Lookup transform to the output of the Derived Column which precedes it, besides multicasting and merging? Is there a better way to represent left joins in SSIS besides the Lookup transform? Is the Lookup transform the best one to use if I need error handling to capture rows in the source table for which no lookup value is returned?
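For reference, the set-based result I'm trying to reproduce in the data flow is an ordinary left join. The table and column names below are placeholders, not my real schema - only the join shape matters:

```sql
-- Placeholder names; this is the T-SQL shape being rebuilt with SSIS transforms.
SELECT d.*,
       l.LookedUpValue
FROM DerivedOutput AS d
LEFT JOIN LookupTable AS l
    ON  l.KeyCol1 = d.KeyCol1
    AND l.KeyCol2 = d.KeyCol2;
```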
↧
Please advise on transforms to use: Lookup and Merge? Or something else?
↧
Dump table to multi-line CSV file....
I want to export a table to a CSV file using SSIS. Each table row has 2 "row type" columns indicating that the data following each one needs to appear on a new line in the file. Example:

Create Table DumpToCSV
(RT1 char(1) NOT NULL, -- RT = row type
 a1 char(1) NOT NULL,
 a2 char(1) NOT NULL,
 RT2 char(1) NOT NULL, -- another row type column
 b1 char(1) NOT NULL,
 DateExported datetime not null)

insert into DumpToCSV values ('A', 1, 2, 'B', 3, '2013-05-28')
insert into DumpToCSV values ('C', 4, 5, 'D', 6, '2013-05-28')

Desired single CSV output file (a split occurs at each "row type" column: A, B, C, D):

A,1,2
B,3
C,4,5
D,6

TIA for any ideas on how to tackle this with SSIS.

Barkingdog
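One possible angle, sketched against the little DumpToCSV table above: split each source row into two one-column rows in the source query, then write that single column to a Flat File destination with no column delimiter. This sketch assumes the RT1 value happens to order the file correctly; a real table would need a proper key to interleave on.

```sql
SELECT Line
FROM (
    -- first line of each source row: RT1,a1,a2
    SELECT RT1 AS GrpKey, 1 AS Part,
           RT1 + ',' + a1 + ',' + a2 AS Line
    FROM DumpToCSV
    UNION ALL
    -- second line of each source row: RT2,b1
    SELECT RT1, 2,
           RT2 + ',' + b1
    FROM DumpToCSV
) AS x
ORDER BY GrpKey, Part;
```

Against the two sample rows this yields the four desired lines in order (A,1,2 / B,3 / C,4,5 / D,6).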
↧
SSIS by using SP with multiple result sets
Hello,

We have an SP which was developed to give us two result sets based on selection. I want to use that SP in an SSIS package. This package just copies the results from the SP and dumps them into a flat file.

When I use the SP directly, the OLE DB Source defaults to the first result set. So I created a new SP which calls the first one to get the desired results. Now it shows me the correct fields, which I can map to the Flat File Destination. But when I execute the package it throws an error, and I found online that this error can sometimes be related to 'SET NOCOUNT ON':

Error: 0xC02092B4 at Data Flow Task, OLE DB Source [1]: A rowset based on the SQL command was not returned by the OLE DB provider.

So instead of using the new SP, I declared a table variable, copied the results into it, and thought of selecting from it again through a SQL command in the OLE DB Source Editor. But when I do this it does not display any results. In order to see if anything was wrong with the SP, I manually inserted records into the table variable, but it still shows no results. The package runs fine, but the text file does not have any data. Can anyone please let me know if I'm doing anything wrong?

First try, with the SP (it does not export any data to the flat file):

DECLARE @Details TABLE
(
    ProductNumber varchar(10),
    CustomerNumber varchar(10)
)
INSERT INTO @Details
EXEC [dbo].[TestSP]
SELECT ProductNumber, CustomerNumber FROM @Details

Just to see if it displays any results (it does not export any data to the flat file either):

DECLARE @Details TABLE
(
    ProductNumber varchar(10),
    CustomerNumber varchar(10)
)
INSERT INTO @Details
VALUES ('12345','67890')
SELECT ProductNumber, CustomerNumber FROM @Details

Finally, this one does export data into the flat file:

SELECT '12345' AS ProductNumber, '67890' AS CustomerNumber
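For what it's worth, a sketch of the wrapper-proc approach (the proc name is hypothetical; it assumes [dbo].[TestSP] returns the desired columns in its first result set). The key detail is SET NOCOUNT ON inside the wrapper, so the OLE DB source sees only the final rowset and not the "n rows affected" messages - and note that INSERT ... EXEC captures only the first result set of the called proc:

```sql
CREATE PROCEDURE dbo.TestSP_ForSSIS   -- hypothetical wrapper name
AS
BEGIN
    -- suppress rowcount messages so the OLE DB source gets a clean rowset
    SET NOCOUNT ON;

    DECLARE @Details TABLE
    (
        ProductNumber  varchar(10),
        CustomerNumber varchar(10)
    );

    -- INSERT...EXEC captures only the FIRST result set of TestSP
    INSERT INTO @Details
    EXEC dbo.TestSP;

    SELECT ProductNumber, CustomerNumber
    FROM @Details;
END
```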
↧
SSIS package responds slowly in BIDS
Hello,

One of the packages in my SSIS solution is reacting very slowly in the BIDS UI. Selecting a task on the Data Flow or the Event Handlers tab takes minutes to accomplish. The other (smaller) packages in the solution respond very well, but this one is quite challenging to work with.

Anyone have an idea?

Regards,
Franky
↧
SSIS and MSDTC on a SQL Authenticated Connection
Hi

I have a package that has a container with a Required transaction, meaning that the package starts an MSDTC transaction.

I have been running this across two servers on the same domain - Server A and Server B - using Windows (SSPI) authentication on the connections. MSDTC is running on both machines, configured to allow inbound transactions, and everything runs hunky-dory.

[i]However[/i], if I change one of the connections to a server not on our domain requiring SQL authentication (Server C), I receive [b]"[Execute SQL Task] Error: Failed to acquire connection "CMS". Connection may not be configured correctly or you may not have the right permissions on this connection."[/b] Server C is running MSDTC version 3.00.00.3535 (Windows 2000), which doesn't appear to allow you to make any specific security configuration changes.

To make things even more curious, if I connect to Server C in SSMS and write a piece of SQL that specifically begins a distributed transaction, it manages to enlist DTC on Server C, and I can see the transaction begin and commit in Component Services.

Can anyone shed any light on this? I can only assume that SSIS is talking to MSDTC through another layer which it doesn't have rights to use.
↧
Stored Proc with temp table issue.
Hi everyone.

I understand that this topic has been covered several times on this forum, but I can't seem to find a thread matching my issue, and am hoping to find a solution.

I am trying to use a stored proc created for a report as a data source and move the result into another database's table. The stored proc first generates a #temp table, then creates a nonclustered index on #temp, then runs a SELECT statement against #temp (with multiple joins).

Stored proc outline:

> create #temp table
> index #temp table
> select from #temp (multiple joins to other tables)

The stored proc was developed for a report and optimized for data reads. However, when I try to use it as a data source, I get an 'Invalid object name #temp' error.

I did some research, but everything I find covers storing data in a temp table and then using the temp table itself as the source. My case is a bit different... a temp table is used, but it goes through multiple further joins and returns a result.

Any suggestions?

Thank you all for reading this!
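One commonly cited workaround, sketched below under the assumption that the proc's final SELECT has a stable column list (the proc name is a placeholder). The designer discovers metadata by running the command with SET FMTONLY ON, a mode in which the #temp table is never actually created - hence "Invalid object name '#temp'". Forcing FMTONLY off in the data source's SQL command can sidestep that, at the cost of the proc being executed once during validation:

```sql
SET FMTONLY OFF;
EXEC dbo.MyReportProc;   -- hypothetical proc name
```

Alternatives people suggest for the same problem include rewriting the proc to use a table variable, or a global ##temp table, so the metadata call can see the object.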
↧
Execute Process Task to execute a .jar
I'm trying to run an Execute Process Task that initiates a Java program. The error I receive when I run it in debug mode is:

[Execute Process Task] Error: An error occurred with the following error message: "The specified executable is not a valid application for this OS platform.".

The Process fields in the Execute Process Task are filled out as follows:

RequireFullFileName = True
Executable = C:\Users\john.doe.corp\Desktop\Calc\rollupcalc.jar
Arguments = java -classpath .;rollupcalc.jar co.corp.Application filename.csv 12
WorkingDirectory = C:\Users\john.doe.corp\Desktop\Calc

All other fields are left at their default values. Any and all help will be greatly appreciated.

HankL.
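A sketch of the likely fix: a .jar is not a Windows executable, which is exactly what the OS-platform error is complaining about. The Executable property should point at java.exe itself, with "java" dropped from the front of Arguments (the java.exe path below is an assumption - substitute the bin folder of the installed JRE/JDK):

```text
RequireFullFileName = True
Executable          = C:\Program Files\Java\jre7\bin\java.exe   (assumed path)
Arguments           = -classpath .;rollupcalc.jar co.corp.Application filename.csv 12
WorkingDirectory    = C:\Users\john.doe.corp\Desktop\Calc
```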
↧
Argument "Server1" for option "connection" is not valid. The command line parameters are invalid. The step failed.
I have an SSIS package that runs fine in BIDS, but when run as a SQL Agent job it fails with the error:

Argument "Server1" for option "connection" is not valid. The command line parameters are invalid. The step failed.

In the Job Step Properties, the Configuration, Command files, Execution options, Logging, Set Values, and Verification tabs do not have anything set. The only change made is on the Data Sources tab, where the server name is changed to server2; the Command Line tab is changed accordingly. But somehow it still points to the connection inside the package. If I change the connection in the package itself and then create a new job pointing to the changed package, it runs fine. Why does changing the data source in the job properties not take effect?

Here is what is on the Command Line tab:

/FILE "C:\ Conv_01.dtsx" /CONNECTION "Connection001";"\"Data Source=server2\SQL2008;Initial Catalog=FO;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False;Application Name=SSIS-Package-{849C75AD-4693-4DF2-9EC7-73C347FEE10C}HEC_1.US-Iv2.5;\"" /CONNECTION " Connection002";"\"Data Source=server2\SQL2008;Initial Catalog=ULL;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False;Application Name=SSIS-Package-{D399B84F-1467-41C3-A292-5B27789B66A3}HEC_EL1.ULL;\"" /CHECKPOINTING OFF /REPORTING E
↧
Unable to pass SQL execute Task Result set values to Data Flow component.
Hi All,

I'm facing the following issue in an SSIS package. I need to get records from one table, and then each value should be passed to another query in a data flow task. What I have done so far:

1) Created an Execute SQL Task with a full result set as the return: SELECT G.GROUPID FROM DBO.RPTGROUP G
2) Created a Foreach Loop connected to 1), using the Foreach ADO enumerator, with the ADO object source variable selected in the drop-down list along with the enumeration mode.
3) Created a data flow task containing an OLE DB source query, to which I will be passing the result values as a parameter.

When I execute the package, only the 1st value from the Execute SQL Task is passed to the data flow source, and it executes successfully. I have 3 values in the Execute SQL Task output, but only one value is currently passed to the data flow source. I've tested by placing a Script Task inside the Foreach container, and there I do get all 3 values from the Execute SQL Task. So the Foreach Loop container is working fine, but I'm unable to pass all the values to the data flow source.

Please help me to pass every value of the result set from the SQL task to the data flow source inside the Foreach container.

Thanks,
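For clarity, here is a sketch of the full wiring being attempted (the variable names User::GroupList and User::GroupId are assumptions, not taken from the actual package). The detail that usually matters is the last one: the OLE DB source's parameter must map to the per-iteration scalar variable that the Foreach loop writes via Variable Mappings, not to the object variable holding the whole result set:

```text
Execute SQL Task:
    ResultSet      = Full result set
    SQLStatement   = SELECT G.GROUPID FROM DBO.RPTGROUP G
    Result Set tab = result name 0  ->  User::GroupList (Object type)

Foreach Loop Container:
    Enumerator            = Foreach ADO Enumerator
    ADO object source var = User::GroupList
    Enumeration mode      = Rows in the first table
    Variable Mappings     = User::GroupId mapped at index 0

Data Flow / OLE DB Source (inside the loop):
    Data access mode = SQL command
    SQL command      = SELECT ... WHERE GROUPID = ?
    Parameters       = Parameter0 -> User::GroupId
```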
↧
Execute java in Task
Hi,

I have a jar file and I need to execute this command:

java -jar openpgp.jar -d "c:\example.asc" "c:\file pgp\readme.txt.asc"

In Integration Services, is there any task that can execute this command?

Thanks.
↧
FTP TASK Error.
I am having a problem with a package I created several years ago. I use a SQL task that generates a list of UNC path names to our mugshot JPEGs. That in turn is connected to a Foreach Loop with an FTP task that copies the files to the web server. It generally takes half an hour for this task to complete, and it is crashing about halfway through with the following:

SSIS package "FTP.dtsx" starting.
Error: 0xC001602A at FTP, Connection manager "FTP Connection Manager": An error occurred in the requested FTP operation. Detailed error description: The operation timed out.
Error: 0xC002918B at FTP Task, FTP Task: Unable to send files using "FTP Connection Manager".
Warning: 0x80020918 at FTP Task, FTP Task: Operation "" failed.

I suspect there is a file missing, and when the FTP task gets to that file it errors out because it cannot find it. I know the task is copying files to the web server because I have deleted some files on the web server and the task puts them back. Like I said, I did this several years ago, and I am thinking there must be a way to view the file names as they are being copied. Anyone have any idea how I can do this, or some other way to find the missing file?

Thanks in advance.
↧
Execute Individual Tasks Without Debugging ?
I'm building an SSIS package with various steps to import CSV files, upload to temp tables, update, copy to other tables, etc.

I want to step through the package, executing one task at a time and verifying that each worked correctly. I can execute a task, but it seems to put the designer into debug mode, and I have to stop debugging after each task runs before I can execute the next one. I don't care about debugging. Can I stop debugging from getting started, so I can just run each task manually, check the results, run the next task, and so on?
↧
Derived Column to handle multiple conditions in a CASE statement?
Hi, I've got a CASE statement that works, but because I have to do it in SSIS I am at a loss.

This works:

, CASE VW.LT
      when 'B1' then 'STD'
      when 'B7' then 'Q2FC'
      when 'B8' then 'Q3FC'
      when 'B9' then 'Q4FC'
  end as ValueType

As you can see, the CASE statement evaluates values in one column and, depending on what they are, renames them. I've pulled out the Derived Column transform to accomplish this, but I am unable to find a way to handle more than one condition.

This works:

([Copy of LT] == "B9") ? "STD" : "NonSTD"

But this doesn't work:

([Copy of LT] =="B9")?"STD": ( [Copy of LT] == "B7"?"Q2FC":( [Copy of LT]=="B8"?"Q3FC":( [Copy of LT]=="B9"?"Q4FC")

I need to handle all conditions. How do I do it in SSIS?
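A sketch of the nested expression the CASE translates to. The SSIS conditional operator (a ? b : c) always requires both branches, and every branch must be the same data type - the broken attempt above is missing the final ":" branch and closing parentheses. The "UNK" fallback below is an assumption: the T-SQL CASE with no ELSE yields NULL, so substitute NULL(DT_WSTR, 4) there if that is the behavior you want:

```text
[Copy of LT] == "B1" ? "STD"  :
[Copy of LT] == "B7" ? "Q2FC" :
[Copy of LT] == "B8" ? "Q3FC" :
[Copy of LT] == "B9" ? "Q4FC" : "UNK"
```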
↧
ProcessInputRow failure
I need help trying to figure out what caused this error. I threw some MessageBox.Show calls into the script, but they never appear. I'm not that familiar with scripts; I inherited this SSIS package from a previous developer.

Unable to cast object of type 'System.String' to type 'Microsoft.SqlServer.Dts.Pipeline.BlobColumn'.
   at Input0Buffer.get_ENTR()
   at ScriptMain.Input0_ProcessInputRow(Input0Buffer Row)
   at UserComponent.Input0_ProcessInput(Input0Buffer Buffer)
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    MessageBox.Show("Processing");
    agData += blobToString(Row.AG);
    carlData += blobToString(Row.CARL);
    fmnData += blobToString(Row.FMN);
    entrData += blobToString(Row.ENTR);
}

private string blobToString(Microsoft.SqlServer.Dts.Pipeline.BlobColumn blob)
{
    string result = "";
    try
    {
        if (blob != null)
        {
            result = System.Text.Encoding.Unicode.GetString(
                blob.GetBlobData(0, Convert.ToInt32(blob.Length)));
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show("In the catch block" + ex);
        result = ex.Message;
    }
    return result;
}
↧
Excel Datasource brings in a couple of numeric fields as nulls?
I have the Excel Connection Manager and Source to read the contents from an Excel file. For some reason, a couple of numeric fields from the Excel worksheet are brought over as NULLs even though they have values of 300 and 150. I am not sure why this is happening. I looked into the format of the fields and they are set to General in Excel; I tried setting them to Numeric and that did not help.
All the other content from the Excel file comes through except for the 2 numeric fields.
I tried sending the contents from the Excel source to a text file in CSV format, and for some reason the 2 numeric fields came out as blank.
Any inputs on getting this addressed will be much appreciated.
Thanks,
Manisha
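A note on the usual suspect here, with a sketch of the commonly cited workaround (the file path is a placeholder). The Jet/ACE Excel driver guesses each column's type by sampling the first few rows (the registry value TypeGuessRows, default 8) and returns NULL for cells that don't match the guessed type. Adding IMEX=1 to the connection string's Extended Properties makes mixed-type columns be read as text, which is often enough to get the values through:

```text
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\path\to\file.xls;
Extended Properties="Excel 8.0;HDR=YES;IMEX=1"
```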
↧
SSIS 2008 - Failed to lock variable Error (Oh, I fixed it)
I have a package in SSIS that I just created. It contains 2 Foreach Loop containers. Within each container are 2 File System tasks with a precedence constraint between them. The first FS task copies a file to an FTP directory. The second FS task moves the file to an archive folder. (To reiterate: copy, then move.)

The FS tasks are loaded with variables. I have 5 of them:

1) OutboundDirectory - used in the enumerator expression of the Foreach; gives the directory to be searched for the file.
2) xx_FileName - populated by the Foreach container with the fully qualified path name of the discovered file, then used as the Source variable for both FS tasks. (There are two of these, one for each Foreach container. They are named differently; "xx" is a stand-in value.)
3) FTPDirectory - used in the FS copy task as the Destination variable.
4) ArchiveDirectory - used in the FS move task as the Destination variable.

The kicker here is that if I have one copy of a file to be processed, the entire package works fine. But if I have multiples of a file (ABC_20111002.txt, ABC_20111003.txt, etc.), then the first execution of one container works fine - the file is copied, then moved - but the execution of the other container fails on the FS move task with the following error:

[quote][b]SSIS Error[/b][hr]Error: 0xC0014054 at Archive Feed: Failed to lock variable "User::ArchiveDirectory" for read access with error 0xC0010001 "The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.".
Error: 0xC002F304 at Archive Feed, File System Task: An error occurred with the following error message: "Failed to lock variable "User::ArchiveDirectory" for read access with error 0xC0010001 "The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.".".[/quote]

There appear to be multiple solutions on Google, few of which reference my specific scenario. All of them seem to describe a complete failure of the package rather than a success followed by a failure. Also, many of them advise changing the variables to read-only, which I tried, but that fails my entire package, so it's a no-go as far as I'm concerned.

[b][u]MAJOR EDIT[/u][/b]

Just as I was writing this, I realized what my problem was. I was using the same variable in two different containers, and two different threads were trying to use it at the same time. For example, I have files that start with ABC_ and files that start with DEF_. I need 2 different containers because you can't put both file name patterns in the enumerator without including files that you may not want to pick up. The File System tasks in both containers were pointed at the same ArchiveDirectory variable, so you might have \\MyServer\Archive being locked and used (at the same time) by both containers.

SSIS apparently doesn't like this. So you need two separate variables pointing to the same path name, one for each individual container to use. That does work; I've verified it.

I've decided to go ahead and post this anyway, because when I searched for this error I could not find anything remotely similar to my situation. I will also be posting this on my blog for future reference. I hope this helps someone else out.
↧
how to use parameters in SSIS update query having CASE[ Giving Error]
The query below gives an error when written in the query editor in SSIS.

[quote]declare @janflag as varchar(56)
set @janflag = 'y'
declare @febflag as varchar(56)
set @febflag = 'y'
-- [ SELECT TOP 1 febfl FROM saupdate ] * saupdate table in the control flow

UPDATE sacustomer
SET salesamt1 = CASE WHEN (@janflag = 'y') THEN ? ELSE salesamt1 END,
    costamt1  = CASE WHEN (@janflag = 'y') THEN ? ELSE costamt1 END,
    qtysold1  = CASE WHEN (@janflag = 'y') THEN ? ELSE qtysold1 END,
    salesamt2 = CASE WHEN (@febflag = 'y') THEN ? ELSE salesamt1 END,
    costamt2  = CASE WHEN (@febflag = 'y') THEN ? ELSE costamt1 END,
    qtysold2  = CASE WHEN (@febflag = 'y') THEN ? ELSE qtysold1 END
where cono = ? and yr = ? and divno = ? and whse = ? and custno = ?[/quote]

I'm not sure about the DECLARE code... please check it and help me.

Thanks
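A sketch of one way around the parse error: the OLE DB parameter mapper often cannot cope with DECLARE/SET statements mixed with ? markers, so pull the flags out of the saupdate table inside the same statement and leave only ? markers. This is an assumption-heavy sketch - the janfl column name is guessed by analogy with the febfl column mentioned in the comment, and the ELSE branches for the month-2 columns are shown referencing the month-2 columns (the original referenced salesamt1/costamt1/qtysold1 there, which may itself be a bug worth checking):

```sql
UPDATE c
SET salesamt1 = CASE WHEN s.janfl = 'y' THEN ? ELSE c.salesamt1 END,
    costamt1  = CASE WHEN s.janfl = 'y' THEN ? ELSE c.costamt1  END,
    qtysold1  = CASE WHEN s.janfl = 'y' THEN ? ELSE c.qtysold1  END,
    salesamt2 = CASE WHEN s.febfl = 'y' THEN ? ELSE c.salesamt2 END,
    costamt2  = CASE WHEN s.febfl = 'y' THEN ? ELSE c.costamt2  END,
    qtysold2  = CASE WHEN s.febfl = 'y' THEN ? ELSE c.qtysold2  END
FROM sacustomer AS c
CROSS JOIN (SELECT TOP 1 janfl, febfl FROM saupdate) AS s
WHERE c.cono = ? AND c.yr = ? AND c.divno = ? AND c.whse = ? AND c.custno = ?;
```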
↧
SSIS variables not getting set when executed in a SQL job
I have a package that loops through a directory and stores each file name in that directory in a variable in my package called "File". After processing, the files are moved to an archive directory. Very simple. It works fine running locally in BIDS, but as soon as I try to run it in a SQL job, the variable is not getting set. I know this because it just grabs the default Excel file path instead of the path I am trying to set with my variable "File".

For Each fileName As String In Directory.GetFiles(path)
    Dts.Variables("File").Value = fileName.ToString()
Next

ExcelFilePath = @[User::File]

Again, it works fine locally. I've tried using a proxy account. I've been banging my head against the wall for 2 days on this. PLEASE HELP!!
↧
no matter how large the destination column eg. nvarchar(max), excel source choke on column
I'm using VS2010 BIDS, importing from an Excel 97-2003 .xls worksheet. I've got the following configuration: Excel Source -> Conversion Split transform -> OLE DB destination.

As long as I exclude the column in question, the package runs and all columns import. However, if I enable the mapping for the column in question, the Excel source chokes on it. The column in the source document contains letters, numbers, hash marks (#), dollar signs ($), etc. I have the destination column set to nvarchar(max) and have also tried nvarchar(255) up to nvarchar(800), but this column just won't import - no matter how large the destination column is, the Excel source chokes on it.

Within Excel I have tried setting the column in question from General to Text format with no improvement. Because the Excel source itself chokes on this particular column, adding a Data Conversion transform with a redirect-on-failure after the Excel source has proven useless.

Can you provide suggestions or an explanation for what could be going on with this column that makes the SSIS Excel source choke on it?

Thanks in advance :crying:
↧
how to write expression for Derived Column transform
I need to filter out all records as follows:

where [Month 01] is null
  and [Month 02] is null
  and [Month 03] is null
  and [Month 04] is null
  and [Month 05] is null
  and [Month 06] is null
  and [Month 07] is null
  and [Month 08] is null
  and [Month 09] is null
  and [Month 10] is null
  and [Month 11] is null
  and [Month 12] is null

The above is T-SQL. I want to do this in a Derived Column. I tried the following, but it doesn't parse:

[Month 01] !=NULL && [Month 02] !=NULL && etc.

How do I do the same thing in SSIS with the Derived Column transform?
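A sketch of the expression, with two caveats. First, SSIS expressions test for NULL with the ISNULL() function, not with != NULL comparisons. Second, filtering rows is the Conditional Split transform's job, not Derived Column's (which only adds or replaces columns) - so the assumption here is that this condition goes into a Conditional Split, with the matching rows sent to an output you simply don't connect:

```text
ISNULL([Month 01]) && ISNULL([Month 02]) && ISNULL([Month 03]) &&
ISNULL([Month 04]) && ISNULL([Month 05]) && ISNULL([Month 06]) &&
ISNULL([Month 07]) && ISNULL([Month 08]) && ISNULL([Month 09]) &&
ISNULL([Month 10]) && ISNULL([Month 11]) && ISNULL([Month 12])
```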
↧