Try ARC6: towards distributed computing in a few minutes

Scared of distributed computing complexities?

With ARC6 you can set up a Computing Element and try common distributed computing workflows in just a few minutes!

ARC6 ships with a so-called zero configuration and works out of the box without any manual configuration at all.

Step 1. Enable NorduGrid ARC6 repos

Prepare your system to install via the NorduGrid Repositories.

Note

Alpha and release-candidate packages are in the testing repository, so make sure it is enabled. On RHEL-based systems, for example, you can use yum --enablerepo=nordugrid-testing to enable it for a single transaction, or yum-config-manager --enable nordugrid-testing to enable it permanently.
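
For illustration, assuming a RHEL-based system with the yum-utils package installed (it provides yum-config-manager), the two options from the note look like this:

[root ~]# yum --enablerepo=nordugrid-testing -y install nordugrid-arc-arex
[root ~]# yum-config-manager --enable nordugrid-testing

The first command enables the repository only for that single transaction; the second enables it for all subsequent yum operations.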

If you want to test ARC6 including all the latest developments, set up your repository to include the nightly builds, following the Using ARC packages from nightly builds instructions.

For the latest developments of the not-yet-released ARC 7, see Nightly Next Builds.

Step 2. Install A-REX

The ARC Resource-coupled EXecution service (A-REX) is the core component that manages authentication, authorization and the job life cycle. Installing A-REX is enough to get a minimal computing element:

[root ~]# yum -y install nordugrid-arc-arex

Step 3. Run A-REX

To start ARC services just run:

[root ~]# arcctl service start --as-configured

You can check if A-REX is running with:

[root ~]# arcctl service list
arc-acix-index                   (Not installed, Disabled, Stopped)
arc-acix-scanner                 (Not installed, Disabled, Stopped)
arc-arex                         (Installed, Enabled, Running)
arc-datadelivery-service         (Not installed, Disabled, Stopped)
arc-gridftpd                     (Not installed, Disabled, Stopped)
arc-infosys-ldap                 (Not installed, Disabled, Stopped)

Note

The arcctl tool automates many ARC CE operations and is designed with bash completion in mind. If you plan to use ARC in production, it is advised to enable completion:

[root ~]# yum install bash-completion python-argcomplete
[root ~]# activate-global-python-argcomplete

Step 4. Generate user certificate and key for testing

Authentication of grid services and users relies heavily on cryptography and uses a certificate/key pair for each entity. ARC6 comes with a Test Certificate Authority on board that can easily issue test user certificates.

The ARC6 zero configuration implements a closed-by-default approach, defining a special authorization object called an authgroup.

During test-user certificate generation, arcctl test-ca automatically adds the issued certificate subject to the testCA.allowed-subjects file, transparently opening up job submission to the test user. The testCA.allowed-subjects file can be found in your /etc/grid-security directory.

No other subject will be able to submit to your system until you change the authgroup settings in arc.conf, as sketched below.
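
If you later want to authorize additional subjects by hand, this is done with an [authgroup] block in arc.conf. A minimal sketch, where the block name and the extra subject are purely illustrative and the file rule mirrors the testCA.allowed-subjects mechanism described above:

[authgroup:testusers]
# allow every subject listed in the file maintained by arcctl test-ca
file = /etc/grid-security/testCA.allowed-subjects
# additionally allow one explicitly named subject (hypothetical example)
subject = /DC=org/DC=example/O=ExampleOrg/CN=Alice Example

The authgroup then has to be referenced from the job-submission interface configuration; see the arc.conf reference for details.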

You can test submission from the host running A-REX or from any other host in the network.

Testing from the host running A-REX

It is technically possible to submit jobs from the root account, but it is advised to use a dedicated regular user. The examples here assume that you do.

To generate a test certificate/key pair and install it to the standard location inside the local user's home directory, run:

[root ~]# arcctl test-ca usercert --install-user user01
User certificate and key are installed to default /home/user01/.globus location for user user01.
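
To verify the installation, you can list the user's .globus directory; you should see a certificate/key pair there (typically usercert.pem and userkey.pem, the standard names that arc* client tools look for):

[root ~]# ls /home/user01/.globus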

Testing from any other host

To submit jobs from any other host (not the one running A-REX), you need to transfer the test user certificate and the CA files to that host.

On the A-REX host, generate and export a user certificate/key pair:

[root ~]# arcctl test-ca usercert --export-tar
User certificate and key are exported to testcert-09160712.tar.gz.
To use it with arc* tools on the other machine, copy the tarball and run the following commands:
  tar xzf testcert-09160712.tar.gz
  source arc-test-certs/setenv.sh

Transfer the tarball to the client host and execute the commands suggested in the arcctl output on that host:

[user ~]$ tar xzf /tmp/testcert-09160712.tar.gz
[user ~]$ source arc-test-certs/setenv.sh

Note

The zero-configured A-REX comes with the EMI-ES and REST interfaces enabled. It listens on port 443, so make sure that port is not blocked by a firewall if you want to reach it from another client host.
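
If the host uses firewalld (common on RHEL-based systems), opening the port could look like this; adapt to whatever firewall you actually run:

[root ~]# firewall-cmd --permanent --add-port=443/tcp
[root ~]# firewall-cmd --reload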

Step 5. Submit job and check it is running

Install ARC client tools on the client host:

[root ~]# yum -y install nordugrid-arc-client

To submit a job, or perform any other action towards the ARC server, you need a so-called proxy certificate, which is a single sign-on token for the distributed grid infrastructure. It is generated in the following way:

[user ~]$ arcproxy
Your identity: /DC=org/DC=nordugrid/DC=ARC/O=TestCA/CN=Test Cert 08272152
Proxy generation succeeded
Your proxy is valid until: 2018-08-28 09:54:24
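
You can inspect the generated proxy, including its subject and remaining lifetime, with the info option of arcproxy:

[user ~]$ arcproxy -I
<output omitted>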

You can start by querying information about your newly installed ARC computing element [1]:

[user ~]$ arcinfo -c https://arc.example.org/arex
Computing service:
  Information endpoint: https://arc.example.org:443/arex
  Information endpoint: https://arc.example.org:443/arex
  Information endpoint: https://arc.example.org:443/arex
  Submission endpoint: https://arc.example.org:443/arex (status: ok, interface: org.ogf.glue.emies.activitycreation)
  Submission endpoint: https://arc.example.org:443/arex (status: ok, interface: org.ogf.bes)
  Submission endpoint: https://arc.example.org:443/arex (status: ok, interface: org.nordugrid.arcrest)
[1] The examples use arc.example.org as the domain name of the A-REX host.

A simple job can be submitted with the arctest tool:

[user ~]$ arctest -J 2 -c https://arc.example.org/arex
Submitting test-job 2:
&( executable = "/usr/bin/env" )( stdout = "stdout" )( stderr = "stdout" )( gmlog = "gmlog" )( jobname = "arctest2" )( clientxrsl = "&( executable = ""/usr/bin/env"" )( jobname = ""arctest2"" )( stdout = ""stdout"" )( join = ""yes"" )( gmlog = ""gmlog"" )" )
Client version: nordugrid-arc-20180822231219
Test submitted with jobid: https://arc.example.org:443/arex/NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm
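
Besides the canned arctest jobs, you can describe your own job in xRSL and submit it with arcsub. A minimal sketch; the file name and job content are illustrative only:

[user ~]$ cat hello.xrsl
&(executable="/bin/echo")
 (arguments="Hello from ARC")
 (stdout="hello.out")
 (jobname="hello")
[user ~]$ arcsub -c https://arc.example.org/arex hello.xrsl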

The job status can be checked with the arcstat tool:

[user ~]$ arcstat https://arc.example.org:443/arex/NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm
Job: https://arc.example.org:443/arex/NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm
 Name: arctest2
 State: Running

Status of 1 jobs was queried, 1 jobs returned information

To fetch the job's stdout, run the arccat tool:

[user ~]$ arccat https://arc.example.org:443/arex/NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm
GRIDMAP=/dev/null
HOSTNAME=arc.zero
TMPDIR=/tmp
GLOBUS_LOCATION=/usr
<output omitted>
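
Once the job has reached the FINISHED state, you can download its output files (the stdout and gmlog defined in the job description) to the client host with arcget, using the job ID returned at submission:

[user ~]$ arcget https://arc.example.org:443/arex/NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm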

Step 6. Play more with the ARC Computing Element

As an admin, you might frequently need to extract information from the logs and directories that the ARC computing element uses. A brief list of the relevant paths can be obtained with:

[root ~]# arcctl config brief
ARC Storage Areas:
    Control directory:
        /var/spool/arc/jobstatus
    Session directories:
        /var/spool/arc/sessiondir
    Scratch directory on Worker Node:
        Not configured
    Additional user-defined RTE directories:
        Not configured
ARC Log Files:
    A-REX Service log:
        /var/log/arc/arex.log
    A-REX Jobs log:
        /var/log/arc/arex-jobs.log
    A-REX Helpers log:
        /var/log/arc/job.helper.errors
    A-REX WS Interface log:
        /var/log/arc/ws-interface.log
    Infosys Infoproviders log:
        /var/log/arc/infoprovider.log
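
The reported paths can be used directly for troubleshooting, e.g. to follow the A-REX service log in real time while submitting jobs:

[root ~]# tail -f /var/log/arc/arex.log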

To get information about and manage jobs on the A-REX server, arcctl job is useful. Operations include, but are not limited to:

  • Listing jobs:
[root ~]# arcctl job list
NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm
zb0LDm7RfEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmDBFKDme1CYXm
<output omitted>

[root ~]# arcctl job list --long
NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm      FINISHED arctest2                        /DC=org/DC=nordugrid/DC=ARC/O=TestCA/CN=Test Cert 08272152
zb0LDm7RfEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmDBFKDme1CYXm      FINISHED arctest2                        /DC=org/DC=nordugrid/DC=ARC/O=TestCA/CN=Test Cert 08272234
<output omitted>
  • Job general information:
[root ~]# arcctl job info NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm
Name         : arctest2
Owner        : /DC=org/DC=nordugrid/DC=ARC/O=TestCA/CN=Test Cert 08272152
State        : FINISHED
LRMS ID      : 16890
Modified     : 2018-08-27 22:06:34
  • Job log:
[root ~]# arcctl job log NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm
2018-08-27T22:03:34Z Job state change UNDEFINED -> ACCEPTED   Reason: (Re)Accepting new job
2018-08-27T22:03:34Z Job state change ACCEPTED -> PREPARING   Reason: Starting job processing
2018-08-27T22:03:34Z Job state change PREPARING -> SUBMIT   Reason: Pre-staging finished, passing job to LRMS
2018-08-27T22:03:36Z Job state change SUBMIT -> INLRMS   Reason: Job is passed to LRMS
2018-08-27T22:06:34Z Job state change INLRMS -> FINISHING   Reason: Job finished executing in LRMS
2018-08-27T22:06:34Z Job state change FINISHING -> FINISHED   Reason: Stage-out finished.
  • A-REX logs that mention the job:
[root ~]# arcctl job log NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm --service
### /var/log/arc/arex.log:
[2018-08-27 22:03:34] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: State: ACCEPTED: parsing job description
[2018-08-27 22:03:34] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: State: ACCEPTED: moving to PREPARING
[2018-08-27 22:03:34] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: State: PREPARING from ACCEPTED
[2018-08-27 22:03:34] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: State: SUBMIT from PREPARING
[2018-08-27 22:03:34] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: state SUBMIT: starting child: /usr/share/arc/submit-fork-job
[2018-08-27 22:03:36] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: state SUBMIT: child exited with code 0
[2018-08-27 22:03:36] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: State: INLRMS from SUBMIT
[2018-08-27 22:06:34] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: Job finished
[2018-08-27 22:06:34] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: State: FINISHING from INLRMS
[2018-08-27 22:06:34] [Arc] [INFO] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: State: FINISHED from FINISHING
[2018-08-27 22:06:34] [Arc] [ERROR] [16298/4] NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm: Job monitoring stop requested with 1 active references
### /var/log/arc/ws-interface.log:
[2018-08-27 22:05:32] [Arc.A-REX] [INFO] [16298/42] GET: id NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm path stdout
[2018-08-27 22:05:49] [Arc.A-REX] [INFO] [16298/43] GET: id NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm path stdout
  • Getting job attributes:
[root ~]# arcctl job attr NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm jobname
arctest2
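
Depending on the arcctl version, arcctl job also provides subcommands for acting on jobs, such as killing or cleaning them; check arcctl job --help for the exact set available. Assuming such subcommands are present, usage would look like:

[root ~]# arcctl job kill NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm
[root ~]# arcctl job clean NhlKDmsmeEtnPSAtDmVmuSEmABFKDmABFKDm2PJKDmBBFKDmxDyQbm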

Now you are ready to Install production ARC6 Computing Element!