
SSIS.Pipeline Description: "component "Excel Source" (570)" failed validation and returned validation status "VS_ISBROKEN"



Backdrop
Being a rookie at SSIS, I managed to complete my first SSIS package for a client. The package loaded data from Excel and CSV files into SQL Server and subsequently into the data mart.

Problem
After deploying the package to the client environment, the initial runs worked seamlessly, but when the larger files were tested a validation error started to pop up. This was utterly confusing, and for a few hours I didn't realize what the error message was saying in plain text.

Note
The message in the title is the last of the validation errors; another, far noisier message appeared before it (the full output is below).

The error message
Executed as user: Microsoft (R) SQL Server Execute Package Utility  Version 10.50.1600.1 for 32-bit  Copyright (C) Microsoft Corporation 2010. All rights reserved.
Started:  1:28:45 p.m.
Error: 2012-10-05 13:29:38.05     Code: 0xC0202009     Source: Import XLSX Files Excel Source [570]     Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.  End Error
Error: 2012-10-05 13:29:38.09     Code: 0xC02020E8     Source: Import XLSX Files Excel Source [570]     Description: Opening a rowset for "browsing_details$" failed. Check that the object exists in the database.  End Error
Error: 2012-10-05 13:29:40.10     Code: 0xC004706B     Source: Import XLSX Files SSIS.Pipeline     Description: "component "Excel Source" (570)" failed validation and returned validation status "VS_ISBROKEN".  End Error
Error: 2012-10-05 13:29:40.15     Code: 0xC004700C     Source: Import XLSX Files SSIS.Pipeline     Description: One or more component failed validation.  End Error
Error: 2012-10-05 13:29:40.21     Code: 0xC0024107     Source: Import XLSX Files      Description: There were errors during task validation.  End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started:  1:28:45 p.m.  Finished: 1:29:40 p.m.  Elapsed:  54.812 seconds.
The package execution failed.  The step failed.
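
The line that actually matters in that wall of text is code 0xC02020E8: opening a rowset for "browsing_details$" failed. The Excel OLE DB provider exposes each worksheet as a table named after the tab with a trailing "$", so this simply means the workbook handed to the package did not contain a worksheet the Excel Source could resolve to "browsing_details". A quick way to confirm that is to list the sheets in the offending file. The snippet below is only a rough diagnostic sketch, assuming Python with the openpyxl package and a made-up file path; it is not something that ships with SSIS.

from openpyxl import load_workbook

# Placeholder path - point this at the workbook the package was given.
path = r"C:\drop\browsing details 2012.xlsx"

wb = load_workbook(path, read_only=True)
print("Worksheets found:", wb.sheetnames)

# The Excel Source in this package validates against "browsing_details",
# i.e. the rowset "browsing_details$" in the error log above.
if "browsing_details" not in wb.sheetnames:
    print("No sheet named 'browsing_details' - this is what fails validation with VS_ISBROKEN.")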


What I did
-          Removed all the spaces from the file name
-          Removed the leading digits from the Excel tab name (it later occurred to me, after speaking to friends, that SQL Server does not like names that start with a digit); a small clean-up sketch follows this list
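
Both of those clean-ups are easy to automate for repeat loads rather than editing each workbook by hand. The sketch below again assumes Python with openpyxl, plus a hypothetical drop folder; it is not part of the deployed package, just an illustration of the two fixes: strip spaces from the .xlsx file names and strip leading digits from the tab names.

import os
import re
from openpyxl import load_workbook

drop_folder = r"C:\drop"   # hypothetical folder the incoming files land in

for name in os.listdir(drop_folder):
    if not name.lower().endswith(".xlsx"):
        continue

    old_path = os.path.join(drop_folder, name)

    # Fix 1: remove spaces from the file name.
    new_path = os.path.join(drop_folder, name.replace(" ", "_"))
    if new_path != old_path:
        os.rename(old_path, new_path)

    # Fix 2: strip leading digits from each tab name (e.g. "2012 browsing_details").
    wb = load_workbook(new_path)
    changed = False
    for ws in wb.worksheets:
        fixed = re.sub(r"^\d+[ _]*", "", ws.title)
        if fixed and fixed != ws.title:
            ws.title = fixed
            changed = True
    if changed:
        wb.save(new_path)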

Comments

  1. Later I realized that the tab names needed to be the same; in this instance a tab named "browsing_details" had to exist to proceed beyond the validation errors

