Let's start by stating that I'm not a SQL DBA 😉 When it comes to databases, I'm just a user who needs a database for my applications 🙂 Lately, in various environments, we've been creating different Ivanti Workspace Control and Automation Manager databases. We simply request an empty database from the SQL department and are assigned DBO rights. Subsequently, we establish a connection to the database from Ivanti Automation Manager (for example) and handle the initial database setup within the application itself.

After initializing the SQL database, the user who performed the initialization can start Ivanti Workspace Control, for example, without encountering any issues. Another user with the same permissions (DBO) on the database, however, is unable to launch the application, even though that second user can access the database using SQL Server Management Studio and appears to have full control over it.

While comparing a ‘working’ database with the ‘non-working’ one, we noticed a difference in the SQL table names. In the working database, all tables were named dbo.table1, dbo.table2, and so forth. In the non-working database, all table names began with the username of the individual who initially initialized the database, for example [Mydomain\Username].table1, etc.

After some investigation, we found out there were several reasons why this might happen. Here’s an overview of the main causes:

1. Default Schema of the User

In SQL Server, each database user can be assigned a default schema. If a user does not explicitly specify a schema when creating a table, the table will be created in their default schema. If the default schema is not set to dbo (which is the default schema for most users), the tables may end up in another schema that includes the domain and username.

Solution: Set the user’s default schema to dbo:

ALTER USER [domain\username] WITH DEFAULT_SCHEMA = dbo;

2. Explicit Schema Specification in SQL Statements

If the SQL statements being executed to create tables include the schema explicitly as domain\username, the tables will be created under that schema.

Solution: Ensure that the SQL statements use dbo or omit the schema to use the default schema:

CREATE TABLE dbo.TableName (Column1 INT, Column2 NVARCHAR(50));

3. Incorrect Connection Context

Sometimes, the connection context or the tool used to run the SQL scripts might be causing the tables to be created with the domain\username schema. This can happen if the tool implicitly uses the connected user’s name as the schema.

Solution:

  • Check the connection settings and the context in which the scripts are being executed.
  • Make sure that the connection is not enforcing a different schema context.
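If you are unsure which schema a connection will actually use, a quick check from that same connection can help. The queries below are a simple sketch; run them in the Ivanti database under the account in question:

-- Shows the default schema of the account you are currently connected with
SELECT SCHEMA_NAME() AS EffectiveDefaultSchema;

-- Shows the default schema configured for every SQL and Windows user in the database
SELECT name, default_schema_name
FROM sys.database_principals
WHERE type IN ('S', 'U');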

4. Lack of Permissions

If the user creating the tables does not have the necessary permissions on the dbo schema, SQL Server might default to using a schema that the user does have permissions on, which could be their own domain\username schema.

Solution: Grant the necessary permissions to the user on the dbo schema:

GRANT ALTER ON SCHEMA::dbo TO [domain\username];
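Note that ALTER on the dbo schema alone is usually not enough to create tables: unless the user is already a member of a role such as db_owner or db_ddladmin, the database-level CREATE TABLE permission is needed as well.

GRANT CREATE TABLE TO [domain\username];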

In our case, the SQL tables were in the initial user's schema rather than dbo, so we had to move them back to the dbo schema. We ran the following SQL query, replacing ‘DOMAIN\USERNAME’ with the schema name that currently prefixes the tables; it automatically generates a script to transfer all tables back to the dbo schema.

SELECT 'ALTER SCHEMA dbo TRANSFER [' + s.name + '].' + o.name
FROM sys.objects o
INNER JOIN sys.schemas s ON o.schema_id = s.schema_id
WHERE s.name = 'DOMAIN\USERNAME'
  AND o.type IN ('U', 'P', 'V');

The query doesn’t make any changes immediately; it only generates a SQL script, which can then be used to actually change the tables. The sample output of the query-generated SQL script appears as follows:

ALTER SCHEMA dbo TRANSFER [DOMAIN\USERNAME].tblAudits
ALTER SCHEMA dbo TRANSFER [DOMAIN\USERNAME].tblSettings
ALTER SCHEMA dbo TRANSFER [DOMAIN\USERNAME].tblObjects
ALTER SCHEMA dbo TRANSFER [DOMAIN\USERNAME].tblResources
ALTER SCHEMA dbo TRANSFER [DOMAIN\USERNAME].tblFiles
ALTER SCHEMA dbo TRANSFER [DOMAIN\USERNAME].tblFolders

Before proceeding with the table changes, ensure you have a valid SQL backup of the database. This step is essential to safeguard against any potential data loss or unforeseen issues.
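A full database backup can be taken with a statement along these lines; the database name and backup path below are just placeholders, so adjust them to your environment:

-- Full backup of the affected database before changing any schemas
BACKUP DATABASE [IvantiDatabase]
TO DISK = N'D:\Backup\IvantiDatabase.bak'
WITH INIT, STATS = 10;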

This script contains a series of ALTER SCHEMA statements that transfer each table (and any views or stored procedures in that schema) from the [DOMAIN\USERNAME] schema to dbo. Copy the script output generated by the query and execute it on the affected database; this moves all objects from [DOMAIN\USERNAME] to dbo.

Now that we have ‘fixed’ the database tables, let’s ensure this doesn’t happen again by modifying the current SQL configuration to prevent new tables from being created incorrectly.

1. Set Default Schema for Users

Ensure that all new and existing database users have their default schema set to dbo. This can be done when creating the user or later if the user already exists.

Setting Default Schema When Creating a User:

CREATE USER [domain\username] FOR LOGIN [domain\username] WITH DEFAULT_SCHEMA = dbo;

Altering Default Schema for Existing Users:

ALTER USER [domain\username] WITH DEFAULT_SCHEMA = dbo;

2. Create Database Roles with Specific Permissions

Instead of assigning permissions directly to individual users, create database roles (owned by dbo) with the necessary permissions and add the users as members of those roles. Note that the members still need their own default schema set to dbo, as described above.

Creating a Role Owned by dbo and Adding a Member:

CREATE ROLE [RoleName] AUTHORIZATION [dbo];
ALTER ROLE [RoleName] ADD MEMBER [domain\username];
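The role itself still needs permissions; what exactly depends on the application, but as a sketch, granting typical data access and execute rights on the dbo schema to the role could look like this:

GRANT SELECT, INSERT, UPDATE, DELETE, EXECUTE ON SCHEMA::dbo TO [RoleName];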

Hopefully, this article prevents you from encountering the same issue we experienced 🙂

We manage a Citrix farm where users primarily launch a full desktop environment. From there, they can also connect to other applications running in Citrix silos or access external Citrix farms. As a user environment management (UEM) solution, we use Ivanti Workspace Control (IWC).

When a user logs onto the primary desktop, the endpoint hostname is utilized by Ivanti Workspace Control within that session. Based on the endpoint hostname, we can set specific configurations using features like “location and devices”. In a double-hop scenario, where a user launches a Citrix published application or another Citrix desktop from within the primary session, the hostname of the primary session server is used as the hostname in the secondary session.

Read More →

For managing several environments, we use Ivanti Automation Manager with Microsoft SQL Server as the database. According to the documentation, Ivanti Automation Manager does not support "SQL Server Always On availability groups," and unfortunately, there is no mention of using a "SQL Server multi-subnet failover cluster."

Supported database systems https://help.ivanti.com/res/help/en_US/IA/2024/Admin/Content/48735.htm

Within our environments, however, a "SQL Server multi-subnet failover cluster" is the standard database configuration that we must use. Simply adding the parameter "MultiSubnetFailover=True" to the database connection string lets the SQL client recognize that it is connecting to a multi-subnet failover cluster. However, since the database connection string is built by Ivanti Automation Manager, we don't have the ability to add "MultiSubnetFailover=True" to it directly; the parameter would have to be included by the Ivanti Automation Manager software itself.

SqlConnection.ConnectionString Property https://learn.microsoft.com/en-us/dotnet/api/system.data.sqlclient.sqlconnection.connectionstring
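For reference, a connection string with the parameter included would look roughly like this; the server and database names below are just placeholders:

Server=MySqlListener;Database=IvantiAutomation;Integrated Security=SSPI;MultiSubnetFailover=True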

Upon inquiry with Ivanti, it was confirmed that there is indeed no support for a "SQL Server multi-subnet failover cluster." Ivanti asked us to submit a Uservoice for this feature through the Ivanti Ideas Portal, which we duly did. However, for unclear reasons, Ivanti has chosen not to implement this feature.

Uservoice: MultiSubnetFailover support (Microsoft OLE DB Driver for SQL Server) https://ivanti.ideas.aha.io/ideas/IA-I-44


Without the "MultiSubnetFailover=True" value in the connection string, Ivanti Automation Manager may, for example, fail to start after the active SQL node changes.


Since we couldn’t avoid using a SQL Server multi-subnet failover cluster, we have temporarily resolved this by implementing a script. It may not be the most elegant solution, but it gets the job done!

We have created a scheduled task on all servers where the Ivanti Automation Manager Console and Ivanti Dispatchers are installed. This task runs every 5 minutes and executes a PowerShell script, which checks if the connection to the database is still possible. If not, it identifies the active SQL node and updates the hosts file accordingly, allowing the Consoles and Dispatchers to establish a connection with the database again.

<#
.SYNOPSIS
This PowerShell script updates the hosts file on a target machine with the current active SQL node IP address.
It checks if the specified target hostname is reachable. If not, it determines the active SQL node and updates the hosts file accordingly.

.DESCRIPTION
This script is designed to be run on a target machine to ensure that it always resolves a specific hostname to the active SQL node IP address.
It checks the availability of the target hostname and updates the hosts file with the IP address of the active SQL node if necessary.

.NOTES
- Script Name: Update-HostsFile.ps1
- Version: 1.0
- Authors: Rink Spies
- Date: 08-04-2024

.PARAMETER None
This script does not accept any parameters.

.EXAMPLE
.\Update-HostsFile.ps1
This command runs the script to update the hosts file with the current active SQL node IP address.

#>

# VARIABLES
$HostsFile = "$env:SystemRoot\System32\drivers\etc\hosts"
$TargetHostname = "MySqlServerName" # <<Update with SQL Server Instance name >>
$SQLNodes = @("1.2.3.4", "2.3.4.5", "3.4.5.6")  # << update with all SQL Node IP's >>
$LogFile = "C:\Windows\Temp\Update-hosts-file.log"

# FUNCTIONS

# Add-HostRecord function replaces any existing entry for the hostname and adds the new record to the hosts file.
function Add-HostRecord {
    param(
        [string]$HostsFilePath,
        [string]$IP,
        [string]$Hostname
    )

    # Remove any stale entries for this hostname so only the active node is listed
    $FilteredContent = Get-Content -Path $HostsFilePath | Where-Object { $_ -notmatch [regex]::Escape($Hostname) }
    Set-Content -Path $HostsFilePath -Value $FilteredContent

    # Append the record for the currently active SQL node
    Add-Content -Path $HostsFilePath -Value "$IP`t`t$Hostname"
}

# Test-ActiveSQLNode function checks if a given SQL node is active.
function Test-ActiveSQLNode {
    param(
        [string]$SQLNode
    )

    return (Test-NetConnection -ComputerName $SQLNode -Port 1433 -InformationLevel Quiet -ErrorAction SilentlyContinue)
}

# Update-HostsFile function updates the hosts file with the IP address of the active SQL node.
function Update-HostsFile {
    foreach ($Node in $SQLNodes) {
        if (Test-ActiveSQLNode $Node) {
            Add-HostRecord -HostsFilePath $HostsFile -IP $Node -Hostname $TargetHostname
            return $Node
        }
    }
    return $null
}

# Log-Output function logs messages to the console and a log file.
function Log-Output {
    param(
        [string]$Message,
        [bool]$IncludeTimestamp = $true
    )

    $logEntry = if ($IncludeTimestamp) {
        "$(Get-Date -Format 'dd-MM-yyyy HH:mm:ss') $Message"
    } else {
        $Message
    }

    Write-Output $logEntry
    Add-Content -Path $LogFile -Value $logEntry
}

# SCRIPT

# Start the script
Log-Output "#############################################"
Log-Output "Starting update hosts file script."

# Check if the current IP for the target hostname is active
if (-not (Test-ActiveSQLNode $TargetHostname)) {
    Log-Output "Current IP for $TargetHostname is not active anymore."
    $activeNode = Update-HostsFile
    if ($activeNode) {
        Log-Output "Active IP $activeNode is online and configured in the hosts file."
    } else {
        Log-Output "None of the IPs are active."
    }
} else {
    Log-Output "Current IP for $TargetHostname is still active."
}

# End the script
Log-Output "Stopping update hosts file script."

As mentioned, this is not really the solution you'd ideally want to use, but hopefully Ivanti Automation Manager will still receive support for MultiSubnetFailover in the future.

Recently, I worked on a project where the workload needed to shift from using a Citrix Published Desktop to a physical laptop, with locally installed applications. As always, there are applications that, for various reasons, cannot be moved from the Citrix Published Desktop to the physical laptop. For these applications, we chose to offer them as Citrix Published Applications. Although this transition went well technically, end users reported that working with published applications was not considered very pleasant.

Scenario: The published apps were offered from a Citrix Virtual Apps and Desktops farm, utilizing Ivanti Workspace Control. Ivanti Workspace Control is a workspace management solution provided by Ivanti, a company specializing in IT management software. It offers features for managing user workspaces across various devices and environments, including physical desktops, virtual desktop infrastructure (VDI), and application virtualization platforms. Unfortunately, Ivanti has announced that Ivanti Workspace Control will reach end of life on December 31, 2026, but at the moment, we are using it to our full satisfaction.

When starting a Citrix Published Application, it takes some time before the application actually starts, due to, among other factors, the loading of the Windows profile and the Ivanti Workspace Control settings. When you subsequently start a second published application, it loads faster, since the entire profile and UEM (User Environment Management) configuration don't need to be processed again. When you close the last Citrix Published Application, the entire user session is logged off, which means the next Citrix Published Application again takes some time to start, as the whole Citrix session needs to be loaded anew.

Read More →

After we recently upgraded our environment to Ivanti Workspace Control 10.12.0.0, we suddenly encountered issues with Citrix Published Applications. Users trying to launch a published application received the error "Application can't be started… (Instant Passthru could not be resolved)".


Although we had not made any changes to the Citrix XenApp Publishing integration within the Ivanti Console, we did check the configuration thoroughly. Everything was set up exactly as Ivanti prescribes.

Because the last update of our golden image only updated Ivanti Workspace Control, we decided to downgrade the agent from 10.12.0.0 to 10.11.10.0, the version we were previously using. And yes, suddenly everything was working properly again.

Read More →

Life cycle management occasionally means SQL servers are replaced and databases are moved to new servers. As a result, the Ivanti Workspace Control datastores have to be moved as well. At first this does not seem very complex, but if the environment uses a primary and a secondary datastore, you have to deal with some extra challenges. The IWC datastore contains three components:

  1. Configuration and state
  2. Logging
  3. Usage Tracking

Out of the box the data is stored in the primary datastore, but the logging and usage tracking data can also be stored in the secondary datastore. This ensures that the configuration datastore does not explode.

Ivanti has a nice support article in which they describe exactly how to move both datastores to a new database server (SQL Server in my case). Basically, you merge the secondary datastore back into the primary, migrate the primary datastore to the new database server, and then split the logging and usage tracking data off into a secondary datastore again.

HOWTO: Migrate database in Ivanti Workspace Control

This all sounds very simple, but what if your secondary datastore is 600 GB in size? Merging the two will take forever, not to mention other challenges: can the primary datastore, for example, even hold that much extra data? In this case you actually want to migrate the primary and secondary datastore to the new SQL environment as they are and simply change the database connection string within the Ivanti Workspace Control console. Although there is no support article describing how this can be accomplished, luckily there is a supported way to do it!

Read More →

During regular maintenance at a customer we noticed the Ivanti Workspace Control logging database was getting quite big. It had reached a size of more than 1 TB, and the cause was not immediately clear. Sure, they had lots of users and kept a lot of auditing data, but the increase in database size couldn't be explained by additional users or anything else obvious.

We contacted Ivanti Support to investigate the huge increase in the logging database size. They told us about a useful tool called the "Workspace Manager Logging Management Tool", created by former employee Patrick van Grinsven. It soon became clear what exactly was using so much space within the logging database.
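The tool itself is the easiest way to dig into this, but even without it, a generic query like the sketch below gives a quick first impression of which tables consume the most space in the logging database (this is standard SQL Server, not part of the Ivanti tooling):

-- Top 20 tables by reserved space, in MB
SELECT TOP (20)
       t.name AS TableName,
       SUM(ps.reserved_page_count) * 8 / 1024 AS ReservedMB
FROM sys.dm_db_partition_stats AS ps
INNER JOIN sys.tables AS t ON t.object_id = ps.object_id
GROUP BY t.name
ORDER BY ReservedMB DESC;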

Read More →

Prior to Windows 10 (build 1607), Sticky Notes was a "desktop app", for which it was quite easy to roam all user settings and notes. But since the Windows 10 Anniversary Update, Sticky Notes is available as a "Windows app" (Universal App), which creates a new challenge.

We have to make sure that the Sticky Notes settings and notes created by users are roamed. If roaming profiles are used, this won't be very challenging, because the whole user profile is stored as soon as a user logs off. However, when local or mandatory profiles are used in combination with a zero-profile technology, such as the one offered by Ivanti Workspace Control (formerly RES Workspace Manager), some challenges lie ahead.

Read More →