Submitted

Ability to include PowerShell script actions in TX

  • August 9, 2017
  • 7 replies
  • 5 views


We have a data source (Afas "Profit", a Dutch ERP system) that we can only access through a SOAP API. This API allows you to do a call and receive a table in XML format.

Calling this SOAP API is quite easy from PowerShell; however, it is not possible to run the PowerShell code from SQL Server (xp_cmdshell is not an option for us).

Besides this, having code outside of TX is a path we don't want to go down, as it puts us back in the world of manually deploying files.

It would be great if we could deploy and run powershell scripts from TX.

This would make sense as an option in "Script Actions", e.g. the right-click menu would then offer three options:

* Add Custom Step

* Add FTP Source

* Add PowerShell Script

 

(Of course, even better would be if TX supported Afas Profit out of the box)

7 replies

  • Author
  • Known Participant
  • August 10, 2017

Hi Jan,

Thanks for your response.

I'd be very interested in collaborating on a custom data source, but haven't been able to find any TX documentation on this.

The process is roughly as follows.

The ERP administrator creates a so-called "GetConnector", which is a sort of query definition in the application.

In a SOAP call to the GetConnector endpoint, I pass the connector name and some options, in XML format like the following (simplified):

<GetDataWithOptions>
  <GetConnector>XYZ</GetConnector>
  <skip>0</skip>
  <take>1000</take>
  <options>
    <outputmode>xml</outputmode>
    <metadata>0</metadata>
    <filter id="1" operator="gt" field="timestampfield">2017-08-10T00:00:00.00000</filter>
  </options>
</GetDataWithOptions>

The documentation for doing the call is public:

https://static-kb.afas.nl/datafiles/help/2_9_7/SE/EN/index.htm#App_Apps_Custom_Get_Call.htm
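
For illustration, a minimal PowerShell sketch of a single call. The endpoint URL, SOAP namespace, and SOAPAction value below are placeholders rather than values taken from the documentation, and the filter options are omitted for brevity:

# Hypothetical endpoint; the real URL depends on the AFAS environment
$endpoint = 'https://myenvironment.afasonline.com/profitservices/appconnectorget.asmx'

$envelope = @"
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetDataWithOptions xmlns="urn:Afas.Profit.Services">
      <GetConnector>XYZ</GetConnector>
      <skip>0</skip>
      <take>1000</take>
      <options><outputmode>xml</outputmode><metadata>0</metadata></options>
    </GetDataWithOptions>
  </soap:Body>
</soap:Envelope>
"@

$response = Invoke-WebRequest -Uri $endpoint -Method Post `
    -ContentType 'text/xml; charset=utf-8' `
    -Headers @{ SOAPAction = 'urn:Afas.Profit.Services/GetDataWithOptions' } `
    -Body $envelope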

The skip/take construction is mandatory and results in a batched download, i.e. you have to keep calling with an increased <skip> value until you get no rows back.
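
Sketched as a PowerShell loop, assuming a hypothetical helper Get-ProfitPage that wraps the call above and returns the row nodes of one page:

$skip = 0
$take = 1000
$rows = @()
do {
    # Get-ProfitPage is a hypothetical wrapper around the SOAP call above
    $page = @(Get-ProfitPage -Skip $skip -Take $take)
    $rows += $page
    $skip += $take # assuming <skip> counts rows; adjust if it counts pages
} while ($page.Count -gt 0)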

The return payload has a tabular structure with a root element that has a child for each row, which in turn has a child node for each column.

<AfasGetConnector>
  <GetConnectorName><Column1>value1</Column1></GetConnectorName>
  <GetConnectorName><Column1>value2</Column1></GetConnectorName>
  <GetConnectorName><Column1>value3</Column1></GetConnectorName>
</AfasGetConnector>
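
Flattening that payload into objects is straightforward in PowerShell; a sketch that turns each row element's children into properties:

[xml]$doc = $response.Content
$rows = foreach ($row in $doc.DocumentElement.ChildNodes) {
    $props = [ordered]@{}
    foreach ($col in $row.ChildNodes) {
        $props[$col.Name] = $col.InnerText
    }
    [pscustomobject]$props
}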


  • Known Participant
  • September 20, 2017

PowerShell could be very useful for managing Azure SQL Databases... scale it up before deploying and executing... scale it down when complete.

Another use would include copying text and Excel files locally.
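
For example, a one-liner sketch (paths are placeholders):

# Pull text and Excel exports into a local staging folder before the load
Copy-Item -Path '\\fileserver\exports\*' -Include '*.txt', '*.xlsx' -Destination 'C:\Local\Staging' -Force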



Bringing this back up again for a couple of different scenarios.

1.

Scaling Azure SQL DB instances with SQL is problematic. I can use a SQL statement to scale a DB up prior to the load, but we are unable to create a loop and wait for the DB to finish scaling. The problem is that we can wait... but the session disconnects when the larger DB is ready and Azure SQL switches the connection, so the SQL stored procedure is never able to determine whether the database scaled successfully.

A PowerShell script could be called to scale the DB as an alternative, and since its session will not be disconnected during the resizing process, it would allow for a scale, wait, confirm, and continue process (sketched below).
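
A hedged sketch of that pattern with the Az.Sql cmdlets (resource names are placeholders). Set-AzSqlDatabase normally blocks until the operation finishes; the poll afterwards confirms the new tier before the load continues:

# Request the larger service objective
Set-AzSqlDatabase -ResourceGroupName 'rg-dwh' -ServerName 'dwh-server' `
    -DatabaseName 'DWH' -RequestedServiceObjectiveName 'P2'

# Wait until the change is visible, then continue with the load
do {
    Start-Sleep -Seconds 15
    $db = Get-AzSqlDatabase -ResourceGroupName 'rg-dwh' -ServerName 'dwh-server' -DatabaseName 'DWH'
} until ($db.CurrentServiceObjectiveName -eq 'P2')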

This could be implemented as an external executable for execution packages, but that would require multiple execution packages to properly sequence the scale-up, load, scale-down process.

If PowerShell were available as a type of Script Action that could be called at the table level, it would also satisfy a requirement from a client of ours. (See #2)

2.

My client has written C# processes to extract data from another system. Ideally, these extracts need to be part of the Discovery Hub load so the data is refreshed in a timely manner, without trying to coordinate many different schedules.

Their C# process to pump data into custom tables may be rather unique within the user community, apart from a mention here:

https://support.timextender.com/hc/en-us/community/posts/360033987671-Possibilty-to-execute-scripts-like-python-powelshell-as-a-custom-script-execution?input_string=Powershell%20Script%20Actions

However, this is something else PowerShell could help with, instead of creating many clumsy SSIS packages and external executables. Custom SSIS packages really are not an option for them, since there would be so many to create.

SSIS *could* be an option *if* external executable calls to SSIS packages allowed parameters to be passed in. Then a more generic SSIS package could be created to execute a process whose name is passed in. This, however, doesn't help in an Azure scenario where SSIS isn't available.

 


  • Participating Frequently
  • December 4, 2019

Great points David!

Aside from what you write, managing SSIS packages alongside TX is a configuration management nightmare (versioning / deployments). Not to mention that we don't have an SSIS server :)

Having support for scripting languages (Python, C#, etc.) would solve two huge product gaps at once:

1. Embedded operations actions, such as up- and downscaling Azure SQL

2. Interacting with complex source systems

I've written a few custom Discovery Hub adapters for the latter scenario, but they require a dramatic amount of code to do things that Python does in 5 lines.


I know that this might not be exactly what you are looking for, but there are alternative ways to execute an external task as part of the flow in TimeXtender.

1/ You could use CLR, though it is quite a lot of work, especially when you have multiple environments.

2/ xp_cmdshell is also an option that gets things done, but it makes your server slightly more vulnerable.

3/ SQL Server Agent. My new best friend :-) The example below will create a simple SQL Server Agent job, which is then started and deleted upon completion. The benefit of this solution is that you are using standard SQL features, and it will survive a transfer between environments.

The custom scripts can be executed as a pre-script on the ODX or on a selected table in the DSA, etc.

-- Create a uniquely named, self-deleting SQL Server Agent job
DECLARE @job NVARCHAR(100) = 'TX jobName - ' + CONVERT(NVARCHAR(30), GETDATE(), 121);

EXEC msdb..sp_add_job
    @job_name = @job,
    @description = 'Automated job to execute command shell script',
    @owner_login_name = 'sa',
    @delete_level = 1; -- 1 = delete the job once it has completed successfully

-- A single step that runs the script through the CmdExec subsystem
EXEC msdb..sp_add_jobstep
    @job_name = @job,
    @step_id = 1,
    @step_name = 'Command Shell Execution',
    @subsystem = 'CMDEXEC',
    @command = 'C:\Local\test.bat',
    @on_success_action = 1; -- 1 = quit the job, reporting success

-- Register the job on the local server and start it
EXEC msdb..sp_add_jobserver @job_name = @job;
EXEC msdb..sp_start_job @job_name = @job;

I have used this solution for some time now, and it seems pretty reliable. To see the actual job that is created in SQL Server Agent, you can comment out the @delete_level parameter.



Martin,

Agreed that these are options for on-prem SQL Server, but not for Azure SQL services.


What if you don't have 'sa' access?