The console is used to import folders into your solution. It updates the sequence numbers for each table and creates the Folder record.

There are two import functions in the SAFE X3 V2 configuration console. The first imports a folder from a backup (file copy) that contains an SVG data directory with the flat data to be imported; the console's folder export function makes it easy to produce such a backup. The second copies a folder directly from another online solution by means of a wizard, so it is no longer necessary to go through manual backup/export/restoration phases to copy a folder from one solution to another.

The various options offered by the import functions of the console cover a wide range of folder copy, restoration and migration requirements.

Export of a folder

The folder export function creates an SVG directory containing the flat data of a folder, so that the folder can later be imported into another solution using the folder import function.

To export a folder, launch the SAFE X3 configuration console, select the folder to export, then click the Export button in the Folders tab of your solution.

In the "Parameter entry" dialog box, choose the directory that will contain the extracted flat data. The console suggests SVG by default.

Import of a backed up folder with SVG flat data directory

The following methodology assumes that the folder to be imported has been extracted and copied into the directory structure of your application folders. The tables have been flat-exported to a directory using the table export function of the X3 application or the export function of the console.

  • Copy the directory of the folder to be imported into [ApplicationServerPath]\FOLDERS\
  • Copy the X3_PUB directory of the folder to be imported into [ApplicationServerPath]\FOLDERS\X3_PUB\

Business Intelligence: Caution: when importing a demonstration folder provided on the product DVD, if the Business Intelligence features must be operational in the folder after the import, the solution MUST have been published on the Business Objects server before the folder import.

Warning:
The console can only import folders stored in the same volume as the root folder (see ADXVOLUMES).
The console does not take into account the "Actual size in srf" option available during folder extraction. To benefit from this optimization, import your folder via the SAFE X3 client.

To import a folder, launch the SAFE X3 configuration console and click the Import button in the Folders tab of your solution.

In the "Import a folder" dialog box, choose the folder to be imported and the sub-directory that contains the extracted flat data. The console automatically suggests SVG if it is present.

Importing a folder to an Oracle database

When importing a folder into an Oracle database, the console lets you specify the size of the tablespaces.

Importing the folder to an SQL Server database

When importing a folder into an SQL Server database, the console lets you specify whether the import will be carried out in a new file group:

  • Use file groups (checkbox selected): the folder import creates two SQL Server files:

    • [DataBaseName]_[FolderName]_DAT.ndf to store the data.
    • [DataBaseName]_[FolderName]_IDX.ndf to store the indexes.


    The use of file groups is recommended for better performance. In this configuration, it is possible to specify the initial sizes of the data and index files.

  • Do not use file groups (checkbox not selected): the folder is imported into the primary data file:
    • [DataBaseName]_data.mdf.


Import parameters

The console can perform some initializations of the imported folder. To do so, you must supply an application connection user; the console then uses this user to save the record of the newly imported folder. It is therefore advised to enter the ADMIN user of your application. This user must be valid and able to log on to the root folder of your solution.

The "Reference folder" and "Copy folder" parameters show the former parameters of the folder being imported. They also let you select, from the list of folders already existing in your application, the ones to be used when the folder record is created.

It is possible to import the archived folder of your folder at the same time as the folder itself. To do so, specify the archived folder to be imported in the drop-down menu provided for that purpose, if it has not already been preselected. If you do not select any archived folder, the folder record will be modified accordingly, and you might lose any already existing archived folder! If you select the "Transfer only the archived folder" checkbox, the data import is carried out for the archived folder only, which makes it possible to import it after the folder with which it is associated. The SVG directory of your archived folder must have exactly the same name as the SVG directory of the folder you wish to import!

The "Do not import data" checkbox imports the complete structure of the folder and of any associated archived folder, but leaves all the tables of the imported folders entirely empty. It is therefore impossible to log on to this folder or to the associated archived folder. This option is useful for folder imports with large data volumes: the folder structure can be imported via the console and the database data restored with an optimized method in an end-of-import script launched by the console (see init_console in the "To know" paragraph).

Import of a folder from another solution

To import a folder from another solution, launch the SAFE X3 V2 configuration console and click the Remote import button in the Folders tab of your solution; a wizard will guide you through the steps required to copy a folder from another solution to your current solution. Using this wizard, it is also possible to migrate a folder from an earlier online solution without any additional manual or technical steps.

Note: in order to use the remote import function with a solution installed on a Unix-type server, the "tar" and "gzip" utilities must be installed and available in the PATH.

Note 2: during the folder transfer, the console creates an "archive" file which is copied from the source solution to the target solution. By default this archive is compressed in order to optimize the volume/transfer-rate ratio, but on systems with very high bandwidth this compression increases the overall transfer time compared to a simple file copy. See the paragraph "Compression rate during the transfer of folders" below.

Phase 1 of the remote import wizard

The first phase is used to choose the type of the source solution of your folder. You can choose a V130-type solution or a higher-version solution managed by your console. V130-type solutions are not administered by the console and are not accessible through an adxadmin administration engine, so the method for connecting to the application and selecting the folders to be migrated differs from that of a solution administered by the console.

Phase 2 of the remote import wizard

The content of this phase depends on the type of solution chosen in phase 1.

If the folder to be migrated is an online V130 folder, you are asked to provide the connection information for the application. A V130 process server can run some actions in the same way as an administration engine, which is why the connection screen is identical. You must provide the server name, the listening port of the V130 process server, and a system account for connecting to the host machine of your V130 process server. An application connection account is not necessary because the console never needs to connect to the V130 source application.

If the source solution is of the 'administered by the console' type, you are asked to choose a solution from the list of solutions administered by your console instance. The source and target solutions cannot be the same.

Phase 3 of the remote import wizard

This screen is used to select, from the online source solution, the online folder to be imported into your current solution. Only the folders on volume A are listed.

Phase 4 of the remote import wizard

This screen lets you choose whether the archived folder should be imported at the same time as the online folder. For a solution administered by the SAFE X3 V2 configuration console, the archived folder is detected automatically.

For a V130-type source solution, the archived folder must be detected manually. To do so, enter the password of the predefined database administrator and click the "Detection" button. An SQL script is then run by the V130 process server on the source server to determine the name and location of the archived folder. In case of failure, a message provides information about why the detection did not succeed.

Phase 5 of the remote import wizard

This screen is used to define the export options for the source folder to be migrated. Two modes are available, depending on compatibility and performance requirements.

The first mode uses the neutral flat-file format in an SVG-type directory whose name must be specified. This format offers lower performance, but it makes it possible to migrate a folder from a solution using an SQL Server database to Oracle and vice versa. It is not suited to very large volumes, but it is suited to a migration involving a change of database type. You can select the "Do not export the data" checkbox to export only the folder structure with all tables empty; the data can then be imported through a customized script launched at the end of the import by the Sage X3 configuration console, using optimized external tools.

The second mode uses the proprietary export tool of the source database for the data, together with an SVG format with empty tables for the folder structure. This mode is suited to larger volumes, but it restricts the target database to the same technical platform as the source: an Oracle export can only feed an Oracle database and an SQL Server export can only feed an SQL Server database, because the tools used are 'exp' for Oracle and 'bcp' for SQL Server. As a consequence, Oracle 7 and 8 databases are not supported. The console does not manage external tools for converting between database platforms. The recommended platforms for this mode are Oracle 9i and 10g and SQL Server 2000 and 2005. If this mode is chosen, the database export is automatically restored into the target database using the same options and tools.

In all cases, the password of the database administrator must be entered in this screen because it is necessary for various tasks such as the sequence export.

Phase 6 of the remote import wizard

This screen contains the same options as the standard import screen for a saved folder. Please refer to the explanations above about the file-group creation option, tablespace sizes, reference and copy folders, etc.

An additional field can be used to rename the folder during its transfer. This field is not accessible in some situations where the Oracle proprietary export option is used (the exp and imp utilities).

Phase 7 of the remote import wizard

A standard progress window shows the progress of the folder migration phases. In case of errors, consult the log files to find out why a phase failed. The folders are extracted to a compressed archive that is transferred to the target server, where it is unpacked to reconstruct the folders in volume A; the standard folder import is then launched with the options defined in the wizard.

Migration of a folder from a previous version or a lesser patch level

Once the folder import in the console is over, you must:

  • imperatively revalidate your imported folder from the main folder of your solution, and
  • save the folder record in the Folder function, then validate it (see the migration method for more details).

Once these actions have been completed, your folder is migrated to the latest online version and it is possible to log on to it.

Warning: do not import a folder of a higher version or with a patch level higher than that of the main folder of your solution.

Running a customized user script at the end of a folder import

At the end of the import, it is possible to automatically run a user script. Such a script can, for instance, integrate data into the database with customized and optimized commands after the folder structure has been imported with empty tables through the normal console process. Using a user script is particularly recommended for copying folders with large data volumes, in order to benefit from the best performance of the optimized tools of the target architecture.

In all import cases, remote or not, the console detects the presence of an init_console[.cmd|.sh] file in the directory of the folder to be imported and executes this command file if it is present. The .cmd or .sh extension is determined by the platform on which the folder is imported. The console is sensitive to the return codes and to the standard error output: if an error is detected, a warning is issued during the import phases and the stderr and stdout outputs are displayed in the import log file.
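As a minimal sketch of such a hook (the file name and the return-code/stderr behavior come from the paragraph above; the tasks inside are placeholders, not a real implementation):

```shell
#!/bin/sh
# Hypothetical init_console.sh placed in the directory of the folder to import.
# The console runs it at the end of the import; a non-zero exit code or any
# output on stderr is reported as a warning in the import log.
set -e                      # stop at the first failing command

echo "init_console: starting post-import data load"

# Placeholder for optimized data-loading commands (impdp, bcp, etc.).
# Write progress to stdout; reserve stderr for genuine errors so the
# console flags them in the log.

echo "init_console: post-import data load finished"
```

The script should exit with status 0 on success so the console does not flag the import with a warning.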

Here is an example of a customized Unix/Oracle script, particularly useful for a remote import or a folder migration because it copies data from one database to another directly through a database link and can therefore use the powerful Oracle Data Pump tools to handle the data.

This script is only an example illustrating the mechanism of customized end-of-import scripts; it is not a reference, and any deployed script must be adapted to the target architecture and needs.

#!/bin/bash

ORACLE_HOME=/opt/oracle/product/11.2.0.1
ORACLE_SID=ORCL

NEWFOLDER=NEWFOLDER
OLDFOLDER=OLDFOLDER

export ORACLE_HOME ORACLE_SID NEWFOLDER OLDFOLDER

###############################################
# Drop all folder sequences and truncate tables
# Connect as the folder schema user ('tiger' is a placeholder password)
$ORACLE_HOME/bin/sqlplus -L $NEWFOLDER/tiger << EOF

set serveroutput on

BEGIN
FOR s IN (SELECT SEQUENCE_NAME FROM USER_SEQUENCES WHERE SEQUENCE_NAME LIKE 'SEQ_%') LOOP
dbms_output.put_line ('Processing table ' || SUBSTR(s.SEQUENCE_NAME,5));
EXECUTE IMMEDIATE ' DROP SEQUENCE ' || s.SEQUENCE_NAME ;
EXECUTE IMMEDIATE ' TRUNCATE TABLE ' || SUBSTR(s.SEQUENCE_NAME,5);
END LOOP;
END;
/

EOF


###############################################
# import data with datapump and dblink
$ORACLE_HOME/bin/impdp system/manager DIRECTORY=dmpdir SCHEMAS=$OLDFOLDER NETWORK_LINK=OLDDB REMAP_SCHEMA=$OLDFOLDER:$NEWFOLDER INCLUDE=TABLE_DATA,SEQUENCE TABLE_EXISTS_ACTION=REPLACE

In this Unix script example, we assume that the 'Import table structure only' option has been used. That is why the first part of the script "prepares" the folder by dropping all the sequences, which were initialized with a minimal value rather than their real value because of that option. Then, in case the script is run several times, all the tables are carefully emptied before the actual data import.

Finally, in the last part of the script, Oracle Data Pump is used through a database link to copy data and sequences directly from one database to another. It is not necessary to copy anything else, since the configuration console has already created the complete folder structure, the objects and the rights.

Don't forget!

You encounter an error of type "User SYS cannot connect to database: SP2-0640: Not connected".

To solve the most common database configuration problems, consult the "To know" section.

You encounter an error of type "cmd538647161.sh: line 3: gzip: command not found" when you use the remote import function.

You are trying to transfer a folder from or to a solution installed on a Unix server, but the 'tar' and 'gzip' utilities are not installed or not available in the PATH.

After the import of a folder via the console, the folder does not appear in the solution Folders tab.

The console could not complete the creation of the folder record. Consult the log file to determine why the folder record creation was interrupted. For the imported folder to be displayed in the solution's folder list, you must either save the record of your imported folder from the parent folder (in the case of a folder duplication) or revalidate the imported folder from the parent folder (in the case of a migration from a previous version).

Beware that the import of very large folders via the console can take a long time (more than 2 hours for valfil).

When the console launches an action on the server, it waits for a response up to a maximum timeout (by default 7,200,000 milliseconds, that is 2 hours). It is possible to increase this timeout in the console preferences: Solution tab, field config.solution.comm.timeout.long.
Increase this value if the 2-hour timeout is insufficient and blocks the import of your folder. The value must be entered in milliseconds. The field can be added to the list of values if it is not already present.
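For example, to double the timeout to 4 hours, the preference would hold the value below (the key/value layout shown here is illustrative; the value is set through the console preferences screen):

```
config.solution.comm.timeout.long=14400000
```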

Error "AINSTCONS : the process does not exist in the archive".

At the end of the import, the console logs on to the application via the root folder in order to initialize the newly imported folder. Once connected, the console launches the INIT_CONSOLE function of the AINSTCONS process. If the message "AINSTCONS: the process does not exist in the archive" appears, you need to update your application by applying the available patches and restart the import.

Compression rate during the transfer of folders

During the folder transfer, the console creates an "archive" file which is copied from the source solution to the target solution. By default this archive is compressed in order to optimize the volume/transfer-rate ratio, but on systems with very high bandwidth this compression increases the overall transfer time compared to a simple file copy.

It is possible to control the compression mechanism in the Safe X3 Configuration Console to improve the folder transfer time when the servers are connected through very high-bandwidth links.

The parameter "config.solution.configuration.nocompression" must be added to the Safe X3 Configuration Console preferences in the "Solution" tab. If its value is set to "true", then:

- in Windows, the 7Zip compression tool will receive the option -mx0 (store only)

- in Unix, the gzip compression tool will be used with the compression level -1 (minimum compression)
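As a rough illustration of the trade-off these settings control, the following shell sketch (hypothetical sample data; only the standard gzip tool is assumed) compares gzip -1, the level selected on Unix when "nocompression" is "true", with the maximum level used by default:

```shell
#!/bin/sh
# Compare gzip -1 (minimum compression, fastest) with gzip -9 (maximum, slowest).
# The sample data below is purely illustrative, not console data.
tmp=$(mktemp -d)
i=1
while [ "$i" -le 20000 ]; do
  echo "sample folder data line $i" >> "$tmp/data.txt"
  i=$((i + 1))
done
gzip -1 -c "$tmp/data.txt" > "$tmp/fast.gz"   # level chosen when nocompression=true
gzip -9 -c "$tmp/data.txt" > "$tmp/best.gz"   # default behavior (maximum level)
fast=$(wc -c < "$tmp/fast.gz")
best=$(wc -c < "$tmp/best.gz")
echo "gzip -1 size: $fast bytes; gzip -9 size: $best bytes"
rm -rf "$tmp"
```

On fast networks, the smaller archive produced by the higher level may not be worth the extra CPU time it costs, which is exactly what the "nocompression" preference trades away.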

If you wish to control the compression level rather than disabling compression with the "nocompression" option, add the parameter "config.solution.configuration.compressionlevel" with a value in {"1", "3", "5", "7", "9"}; any other value is ignored.

If the option "nocompression" is set to "true" then any compressionlevel value will be ignored.

Finally, if the "nocompression" option is absent or set to "false", and the "compressionlevel" option is not set to a valid value either, the maximum compression level is used for the 7Zip or gzip tool depending on the operating system. This is the default behavior during folder transfers in the Safe X3 Configuration Console.