
SSIS Job fails when it calls Excel via the SQL Agent but succeeds from SSDT

If you have an SSIS package that automates Excel and fails when run on a schedule under SQL Server Agent, but succeeds when executed interactively in Visual Studio/BIDS/SSDT, it may produce an error like this:

Executed as user: DOMAIN\user. Microsoft (R) SQL Server Execute Package Utility  Version 12.0.4100.1 for 64-bit  Copyright (C) Microsoft Corporation. All rights reserved.    Started:  10:52:25  Error: 2017-02-06 10:52:26.26     Code: 0x00000001     Source: Open Spreadsheet and Run Macro      Description: Exception has been thrown by the target of an invocation.  End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  10:52:25  Finished: 10:52:26  Elapsed:  0.485 seconds.  The package execution failed.  The step failed.

The issue is a permissions error. When the package runs from a SQL Agent job step, Excel is launched via DCOM under a different security context than when you run the package interactively. The identity of the Microsoft Excel Application DCOM component needs to be changed in DCOM Config (run dcomcnfg, or open Component Services from Administrative Tools in Control Panel). On the Identity tab the default is “The launching user”; change it to a specific user with sufficient permissions. In this case we used the service account that SQL Agent runs under.
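If you want to check which identity is currently configured, for example to compare the server that runs the job with a machine where the package works, the setting that dcomcnfg writes ends up under HKEY_CLASSES_ROOT\AppID in the registry. Below is a minimal Python sketch of that check, to be run on the server itself; the display name “Microsoft Excel Application” and the interpretation of the RunAs value are assumptions to confirm against what you see in DCOM Config, not part of the fix itself.

# A minimal sketch, assuming Python 3 on the Windows server that runs the SQL Agent job.
# It only reads the value dcomcnfg stores, so you can verify the Identity setting.
import winreg

TARGET_NAME = "Microsoft Excel Application"   # name shown in DCOM Config (assumption)

def excel_dcom_identity():
    """Return (appid_guid, identity) for the Excel DCOM application, or (None, None)."""
    with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, "AppID") as appids:
        index = 0
        while True:
            try:
                guid = winreg.EnumKey(appids, index)
            except OSError:
                return None, None          # ran out of AppID subkeys
            index += 1
            try:
                with winreg.OpenKey(appids, guid) as appkey:
                    if winreg.QueryValueEx(appkey, "")[0] != TARGET_NAME:
                        continue
                    try:
                        identity = winreg.QueryValueEx(appkey, "RunAs")[0]
                        # "Interactive User" = "The interactive user";
                        # "DOMAIN\\user"     = "This user"
                    except FileNotFoundError:
                        identity = "The launching user (default, no RunAs value)"
                    return guid, identity
            except OSError:
                continue                   # unnamed or unreadable AppID, skip it

if __name__ == "__main__":
    guid, identity = excel_dcom_identity()
    if guid is None:
        print("Microsoft Excel Application not found under HKCR\\AppID - check dcomcnfg.")
    else:
        print(f"AppID {guid} runs as: {identity}")

Note that this only verifies the setting. The change itself is still best made on the Identity tab in dcomcnfg, because the password for a specific account is not stored in this registry value.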

