Build better products with our product team
I'm a big fan of the recent improvements you have made to the dashboards; they allow me to do in-depth analytics and generate the reports I use in my QBRs. It would be even better if we had more filtering options available, such as:
- Company
- Sign-up date
- Age
- Account

Right now I need to get this data elsewhere and combine data sets. What do you guys think?
Add feature to enable table partitioning in SSAS tabular models.
Hi, at the moment the ODX task schedule runs on UTC time, but the package execution schedule runs on the server's local time. As there is clearly a dependency between the two, this makes it extra difficult to synchronize the task execution and the package execution. My suggestion is to have the ODX task schedule also run on the server's local time, or alternatively to enforce UTC time on package execution. Ideally, the ODX and package executions would be integrated into a single scheduler container.
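To illustrate why the mixed time bases are painful, here is a minimal Python sketch (the 02:00 UTC ODX task, the 03:00 local package, and the Europe/Copenhagen server time zone are all hypothetical): the offset between the two schedules shifts with daylight saving time, so a gap that is safe in winter can disappear in summer.

```python
from datetime import datetime, time, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

SERVER_TZ = ZoneInfo("Europe/Copenhagen")  # hypothetical server time zone

def odx_start_in_local(run_date, odx_utc_time=time(2, 0)):
    """Server-local wall-clock start of an ODX task scheduled at 02:00 UTC."""
    utc_dt = datetime.combine(run_date, odx_utc_time, tzinfo=timezone.utc)
    return utc_dt.astimezone(SERVER_TZ)

# The package is scheduled at 03:00 local. In winter (UTC+1) the ODX task
# also lands at 03:00 local -- a collision; in summer (UTC+2) it lands at
# 04:00 local, i.e. AFTER the package that depends on it.
print(odx_start_in_local(datetime(2024, 1, 15).date()))  # 03:00+01:00
print(odx_start_in_local(datetime(2024, 7, 15).date()))  # 04:00+02:00
```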
Hello TimeXtender team, we have a lot of queries regarding the connection with the Theobald IS DELTAQ functionality; this is a much-needed feature in Discovery Hub! Unfortunately, the route through SSDT is often rather daunting. Is there already a plan to implement this, or all Theobald IS functions, in Discovery Hub? Regards
It would be great to be able to publish the semantic layer to a Power BI (Premium) XMLA endpoint: https://docs.microsoft.com/en-us/power-bi/admin/service-premium-connect-tools

Why you should build this: Right now it is possible to publish the semantic layer to an SSAS tabular model (either in the cloud or on-prem). We often use this, as we love the fact that we manage RLS and security, as well as our models, fully in TimeXtender. We use SSAS in Azure when we want to share with users outside of our own organization. The new ability to publish straight to Power BI Premium means that we could save a lot of cost on SSAS in Azure, and it adds to the value of TX as the only interface necessary to publish data to Power BI reports.

It also seems to be quite a small change (from my perspective): it looks like you are already using XMLA to push the semantic layer to SSAS on Azure, so I hope this would be relatively simple, although I am happy to be corrected if that is not the case.

Finally, read this excellent blog post on why and how you would want to migrate to Premium: https://data-marc.com/2020/06/16/migrate-analysis-services-models-to-power-bi-using-tabular-editor/ It features a few more links at the bottom that are also really valuable to read.
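For what it's worth, here is a minimal sketch of talking to a Premium dataset over the XMLA endpoint from Python, assuming the pyadomd package (a thin wrapper around the ADOMD.NET client, which must be installed separately) and hypothetical workspace, dataset, and service-principal credentials. It is the same protocol used against Azure AS, which is why the change feels small.

```python
from pyadomd import Pyadomd  # thin Python wrapper around ADOMD.NET

# Hypothetical workspace, dataset, and service principal. A Premium
# workspace's XMLA endpoint speaks the same protocol as Azure AS / SSAS.
conn_str = (
    "Provider=MSOLAP;"
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/SalesWorkspace;"
    "Initial Catalog=SalesModel;"
    "User ID=app:<client-id>@<tenant-id>;Password=<client-secret>;"
)

with Pyadomd(conn_str) as conn:
    # Query the published model over XMLA, exactly as against Azure AS.
    with conn.cursor().execute("EVALUATE TOPN(5, 'Customer')") as cur:
        for row in cur.fetchall():
            print(row)
```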
About Azure Key Vault: https://docs.microsoft.com/en-us/azure/key-vault/basic-concepts

We, as well as many others, store sign-in information (either just a username and password, or whole connection strings) in Azure Key Vault.

Example user scenario: You have a script that needs to connect to a database and fetch some data. As you are concerned with security, you do not want the username and password written into the script. Instead you use a secret, stored in Key Vault, to get a connection string that the script can use. You, as a user with access to Key Vault, run the script. The script connects to Key Vault, fetches the connection string, and uses it to connect to the database. No password or username is floating around. If the username or password changes, the connection string is updated in Key Vault and everything continues to work.

For Discovery Hub the scenario is much the same: Discovery Hub should just store the reference to the Key Vault and get the password (or the whole connection string) from it. If the username or password changes, it is updated in Key Vault, and Discovery Hub will continue to work, as it gets the username/password from the Key Vault. This would free us from needing to update usernames and passwords in Discovery Hub whenever we need to rotate or otherwise change passwords.

Best regards, Trond-Eirik
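For reference, the script scenario above looks roughly like this minimal sketch, assuming the azure-identity and azure-keyvault-secrets packages; the vault name, secret name, and pyodbc database access are hypothetical stand-ins:

```python
import pyodbc
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Authenticate as whoever runs the script (user, service principal, or
# managed identity); no credentials are written into the script itself.
credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://contoso-vault.vault.azure.net",  # hypothetical vault
    credential=credential,
)

# Fetch the connection string stored as a secret. When the password rotates,
# only the secret in Key Vault is updated; this script never changes.
conn_str = client.get_secret("warehouse-connection-string").value

conn = pyodbc.connect(conn_str)
rows = conn.cursor().execute("SELECT TOP 5 Name FROM dbo.Customer").fetchall()
```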
It would be much easier to order the steps in an execution package if we were able to drag and drop the objects around.
Please add the ability to convert any useful instruction or answer provided by a community member into an article. This is one of the trending features of modern communities: building a community knowledge base from user instructions that can then resolve a lot of cases. It would be nice if every article had its own direct link, so it can be shared with customers. This ability saves moderators time (they do not need to write such articles themselves). Also, users trust other users more than brand representatives. It could also serve as an additional engagement approach: users whose posts become articles feel respected and might be promoted.
Is it possible in a multi-environment setup to transfer just one project perspective? This could be really useful when multiple people are developing one project, since not all project perspectives are ready to transfer from development/test to a production server at the same time.
It would be really nice if we could make the email notification an execution package uses dependent on the environment. Right now, if you want to vary, say, the recipients of a success or failure message based on environment, you must create a separate version of each package for every environment you have. If this were combined with Ulfar's suggestion here, the entire execution package concept would be hugely streamlined. As it is, it's not unusual to need at least one execution package per environment per business area, often leading to creating and maintaining dozens of mostly redundant execution packages. I think implementing these two solutions would be a huge "quality of life" improvement for the tool.
Add an option to generate documentation for a specific project perspective.
Power BI and SSAS Tabular have a setting that applies row-level security across bi-directional relationships, i.e., lets the security filter flow to both tables. This feature is needed in TimeXtender's semantic layer.
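For reference, in a Tabular model's TMSL definition this corresponds to the securityFilteringBehavior property on a relationship; a minimal excerpt with hypothetical table and column names:

```json
{
  "name": "DimEmployee_FactSales",
  "fromTable": "FactSales",
  "fromColumn": "EmployeeKey",
  "toTable": "DimEmployee",
  "toColumn": "EmployeeKey",
  "crossFilteringBehavior": "bothDirections",
  "securityFilteringBehavior": "bothDirections"
}
```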
Just like we have Business Functions for the infamous "Mind Destroying Expressions" (MDX) ;-) https://support.timextender.com/hc/en-us/articles/210439443-Business-Function-Library it would be pretty cool to offer for DAX the same library parametrization support that's available for MDX on the OLAP MD semantic model.
I loaded some records from a view, which is obviously built off a table. When I looked at the records in the destination table, the DW_SourceCode field had the value 'View'. I was expecting the underlying DW_SourceCode value from the source table.

Response from TX: The current behavior of custom views is to simply populate the destination table's DW_SourceCode with the value 'View'. Unfortunately, you can't map any field onto DW_SourceCode in the destination table, but you can add it as an additional custom field. This feature would be nice for tracking and for being able to query the underlying source dataset.
When source tables are heavily used and the ODX Server performs a load, the process is very slow or may even fail. Using a Business Unit, we have the option to 'Allow Dirty Reads', which lets TX read tables without locking them. This option is missing in the ODX Server: currently we have to add every table as a Managed Query with a WITH (NOLOCK) hint. Ticket #18989.
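For context, this is roughly what each Managed Query boils down to today; a minimal sketch with hypothetical server and table names, shown here via pyodbc (inside TX only the SELECT text is pasted):

```python
import pyodbc

# Hypothetical source server and table. The NOLOCK hint (equivalent to the
# READ UNCOMMITTED isolation level) reads rows without taking shared locks,
# so busy source tables are not blocked -- at the risk of dirty reads.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=src-sql01;Database=ERP;Trusted_Connection=yes;"
)

query = "SELECT * FROM dbo.SalesOrder WITH (NOLOCK)"  # per-table hint
rows = conn.cursor().execute(query).fetchall()
```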
There is currently no built-in functionality to execute a process on failure of an execution package. In fact, one of our clients has had to build an external process that watches the log table in the repository database for failures and then executes other processes. I think a simple solution would be to allow running a package upon failure. That execution package could execute stored procs, outside SSIS packages, etc., which would allow for various types of interaction, such as sending a short SMS text message about the failure or creating a trouble ticket in an IT management system.
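The external watcher our client built is roughly the following sketch; the repository table, column names, and connection details below are hypothetical stand-ins, not the actual TimeXtender repository schema:

```python
import time
from datetime import datetime

import pyodbc

# Hypothetical log table and columns -- adjust to the real repository schema.
FAILURE_QUERY = """
SELECT PackageName, EndTime
FROM dbo.ExecutionLog WITH (NOLOCK)
WHERE Status = 'Failed' AND EndTime > ?
"""

def notify(package_name):
    # Placeholder: send an SMS, create a ticket in the IT management
    # system, run another stored proc or SSIS package, etc.
    print(f"ALERT: execution package '{package_name}' failed")

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=repo-sql;Database=TXRepository;Trusted_Connection=yes;"
)

last_checked = datetime.now()
while True:
    for name, end_time in conn.cursor().execute(FAILURE_QUERY, last_checked):
        notify(name)
        last_checked = max(last_checked, end_time)
    time.sleep(60)  # poll once a minute
```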
Object-level security was introduced with the launch of Tabular 2017. This feature allows you to hide tables and columns from the end user. The ability to hide columns and tables is vital if you need to do something like allow a salesperson to see their total sales but hide the cost of the sale, which also lives in the sales fact. Another critical example is restricting access to tables or columns that contain PII to managers or HR. In fact, I do not know how you can make a Tabular model GDPR- or CCPA-compliant without this feature, unless you exclude all PII entirely. There are hacky workarounds that get a similar effect with row-level security, but they only work for simple, small models. I think anyone who's building a serious enterprise solution with stringent compliance requirements is going to really want this feature.
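For reference, this is what object-level security looks like in a Tabular role definition (TMSL, compatibility level 1400+); a minimal excerpt with hypothetical role, table, and column names:

```json
{
  "name": "Sales Reps",
  "modelPermission": "read",
  "tablePermissions": [
    {
      "name": "FactSales",
      "columnPermissions": [
        { "name": "UnitCost", "metadataPermission": "none" }
      ]
    },
    { "name": "EmployeePII", "metadataPermission": "none" }
  ]
}
```

With metadataPermission set to none, the column or table simply does not exist for members of the role, which is exactly the salesperson/PII scenario described above.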