Posts by author antonio

How to find subversion working copies in my disk

I need to find all the SVN working copies on my hard disk. This is the find command I executed:

$ find / -type d -exec test -e "{}/.svn" \; -prune -print
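Here test -e succeeds for any directory containing a .svn entry, and -prune stops find from descending into it, so directories nested inside a working copy are not reported again. The same technique can be scoped to a single tree (home directory shown as an illustration):

$ find ~ -type d -exec test -e "{}/.svn" \; -prune -print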

Fixed problems with EOBS datasets

The EOBS datasets were suffering problems due to name resolution failures for the ECA's OPeNDAP server: opendap.nmdc.eu

This has been fixed by pointing to KNMI's OPeNDAP server: opendap.knmi.nl

Remember that this is a remote dataset accessed on demand, so some connection issues are to be expected.
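If in doubt, the name resolution itself is easy to verify with standard DNS tools (an illustrative check, not part of the fix):

$ host opendap.knmi.nl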

Regards

Antonio

TDS is reporting too many open files

The THREDDS Data Server (TDS) reports "too many open files" when many users access the OPeNDAP service concurrently.

This is happening with the ECOMS data gateway, which publishes virtual datasets (NcML) aggregating a huge number of files. For example, the System4 virtual datasets are a collection of 135k GRIB files, so the potential number of open files for these datasets is 3x that amount (405k files).

To avoid this, the limit on open files must be increased.

The current limits for a session can be checked as follows:

[user@oceano ~]# ulimit -Hn
16384
[user@oceano ~]# ulimit -Sn
16384

For a running process, the PID is required. For example:

[user@oceano ~]# cat /proc/10051/limits
Limit                     Soft Limit           Hard Limit           Units
Max cpu time              unlimited            unlimited            seconds
Max file size             unlimited            unlimited            bytes
Max data size             unlimited            unlimited            bytes
Max stack size            10485760             unlimited            bytes
Max core file size        0                    unlimited            bytes
Max resident set          unlimited            unlimited            bytes
Max processes             191823               191823               processes
Max open files            16384                16384                files
Max locked memory         32768                32768                bytes
Max address space         unlimited            unlimited            bytes
Max file locks            unlimited            unlimited            locks
Max pending signals       191823               191823               signals
Max msgqueue size         819200               819200               bytes
Max nice priority         0                    0
Max realtime priority     0                    0

To increase the limit, edit the security limits for the user:

[root@oceano ~]# nano /etc/security/limits.conf

and set the hard and soft limits. For example:

user            soft    nofile           100000
user            hard    nofile           100000
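If the limit should apply to every account rather than a single user, the wildcard domain also works (standard limits.conf syntax, shown as an illustration):

*               soft    nofile           100000
*               hard    nofile           100000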

Be sure that the system-wide limit is also high enough:

cat /proc/sys/fs/file-max

If it is too low, edit the sysctl properties and reload them:

[root@oceano ~]# nano /etc/sysctl.conf

adding this line

fs.file-max = 500000

and reload the new property value:

[root@oceano ~]# sysctl -p
...

and check that the new value is in effect:

[root@oceano ~]# cat /proc/sys/fs/file-max
500000

For the user limits to take effect, the user needs to start a new session and check the new limits:

[user@oceano ~]# ulimit -Hn
100000
[user@oceano ~]# ulimit -Sn
100000

We can also check that processes use the new limits:

[user@oceano TDS5]$ cat /proc/32090/limits
Limit                     Soft Limit           Hard Limit           Units
Max cpu time              unlimited            unlimited            seconds
Max file size             unlimited            unlimited            bytes
Max data size             unlimited            unlimited            bytes
Max stack size            10485760             unlimited            bytes
Max core file size        0                    unlimited            bytes
Max resident set          unlimited            unlimited            bytes
Max processes             191823               191823               processes
Max open files            1000000              1000000              files
Max locked memory         32768                32768                bytes
Max address space         unlimited            unlimited            bytes
Max file locks            unlimited            unlimited            locks
Max pending signals       191823               191823               signals
Max msgqueue size         819200               819200               bytes
Max nice priority         0                    0
Max realtime priority     0                    0
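To see how close a TDS process is to its limit, the open descriptors can be counted directly (reusing the PID from the example above):

[user@oceano TDS5]$ ls /proc/32090/fd | wc -l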

Using pre-built netCDF-C Libraries for Visual Studio with MinGW

http://www.unidata.ucar.edu/software/netcdf/docs/winbin.html

http://www.mingw.org/wiki/CreateImportLibraries

Following the recipe in the second link, an import library (libnetcdf.a) for MinGW can be created from the pre-built netcdf.dll and a netcdf.def file:

D:\NetCDF>dlltool -d netcdf.def -D netcdf.dll -l libnetcdf.a

The netcdf.def file lists the symbols exported by the DLL:

EXPORTS
Cde2h
Cdh2e
NC3__enddef
NC3_abort
NC3_close
NC3_create
NC3_del_att
NC3_get_att
NC3_get_vara
NC3_inq
NC3_inq_att
NC3_inq_attid
NC3_inq_attname
NC3_inq_base_pe
NC3_inq_format
NC3_inq_format_extended
NC3_inq_type
NC3_inq_unlimdim
NC3_open
NC3_put_att
NC3_put_vara
NC3_redef
NC3_rename_att
NC3_set_base_pe
NC3_set_fill
NC3_sync
NC4__enddef
NC4_abort
NC4_close
NC4_create
NC4_def_compound
NC4_def_dim
NC4_def_enum
NC4_def_grp
NC4_def_opaque
NC4_def_var
NC4_def_var_chunking
NC4_def_var_deflate
NC4_def_var_endian
NC4_def_var_fill
NC4_def_var_fletcher32
NC4_def_vlen
NC4_del_att
NC4_get_att
NC4_get_var_chunk_cache
NC4_get_vara
NC4_get_vlen_element
NC4_inq
NC4_inq_att
NC4_inq_attid
NC4_inq_attname
NC4_inq_compound_field
NC4_inq_compound_fieldindex
NC4_inq_dim
NC4_inq_dimid
NC4_inq_dimids
NC4_inq_enum_ident
NC4_inq_enum_member
NC4_inq_grp_full_ncid
NC4_inq_grp_parent
NC4_inq_grpname
NC4_inq_grpname_full
NC4_inq_grps
NC4_inq_ncid
NC4_inq_type
NC4_inq_type_equal
NC4_inq_typeid
NC4_inq_typeids
NC4_inq_unlimdim
NC4_inq_unlimdims
NC4_inq_user_type
NC4_inq_var_all
NC4_inq_varid
NC4_inq_varids
NC4_insert_array_compound
NC4_insert_compound
NC4_insert_enum
NC4_open
NC4_put_att
NC4_put_vara
NC4_put_vlen_element
NC4_redef
NC4_rename_att
NC4_rename_dim
NC4_rename_grp
NC4_rename_var
NC4_set_fill
NC4_set_var_chunk_cache
NC4_sync
NC4_var_par_access
NCD3_close
NCD3_inq_format_extended
NCD3_open
NC_findtestserver
cdChar2Comp
cdParseRelunits
cdRel2Iso
nc__create
nc__create_mp
nc__enddef
nc__open
nc__open_mp
nc__testurl
nc_abort
nc_advise
nc_close
nc_copy_att
nc_copy_var
nc_create
nc_def_compound
nc_def_dim
nc_def_enum
nc_def_grp
nc_def_opaque
nc_def_var
nc_def_var_chunking
nc_def_var_chunking_ints
nc_def_var_deflate
nc_def_var_endian
nc_def_var_fill
nc_def_var_fletcher32
nc_def_vlen
nc_del_att
nc_delete
nc_delete_mp
nc_enddef
nc_free_string
nc_free_vlen
nc_free_vlens
nc_get_att
nc_get_att_double
nc_get_att_float
nc_get_att_int
nc_get_att_long
nc_get_att_longlong
nc_get_att_schar
nc_get_att_short
nc_get_att_string
nc_get_att_text
nc_get_att_ubyte
nc_get_att_uchar
nc_get_att_uint
nc_get_att_ulonglong
nc_get_att_ushort
nc_get_chunk_cache
nc_get_chunk_cache_ints
nc_get_var
nc_get_var1
nc_get_var1_double
nc_get_var1_float
nc_get_var1_int
nc_get_var1_long
nc_get_var1_longlong
nc_get_var1_schar
nc_get_var1_short
nc_get_var1_string
nc_get_var1_text
nc_get_var1_ubyte
nc_get_var1_uchar
nc_get_var1_uint
nc_get_var1_ulonglong
nc_get_var1_ushort
nc_get_var_chunk_cache
nc_get_var_chunk_cache_ints
nc_get_var_double
nc_get_var_float
nc_get_var_int
nc_get_var_long
nc_get_var_longlong
nc_get_var_schar
nc_get_var_short
nc_get_var_string
nc_get_var_text
nc_get_var_ubyte
nc_get_var_uchar
nc_get_var_uint
nc_get_var_ulonglong
nc_get_var_ushort
nc_get_vara
nc_get_vara_double
nc_get_vara_float
nc_get_vara_int
nc_get_vara_long
nc_get_vara_longlong
nc_get_vara_schar
nc_get_vara_short
nc_get_vara_string
nc_get_vara_text
nc_get_vara_ubyte
nc_get_vara_uchar
nc_get_vara_uint
nc_get_vara_ulonglong
nc_get_vara_ushort
nc_get_varm
nc_get_varm_double
nc_get_varm_float
nc_get_varm_int
nc_get_varm_long
nc_get_varm_longlong
nc_get_varm_schar
nc_get_varm_short
nc_get_varm_string
nc_get_varm_text
nc_get_varm_ubyte
nc_get_varm_uchar
nc_get_varm_uint
nc_get_varm_ulonglong
nc_get_varm_ushort
nc_get_vars
nc_get_vars_double
nc_get_vars_float
nc_get_vars_int
nc_get_vars_long
nc_get_vars_longlong
nc_get_vars_schar
nc_get_vars_short
nc_get_vars_string
nc_get_vars_text
nc_get_vars_ubyte
nc_get_vars_uchar
nc_get_vars_uint
nc_get_vars_ulonglong
nc_get_vars_ushort
nc_get_vlen_element
nc_inq
nc_inq_att
nc_inq_attid
nc_inq_attlen
nc_inq_attname
nc_inq_atttype
nc_inq_base_pe
nc_inq_compound
nc_inq_compound_field
nc_inq_compound_fielddim_sizes
nc_inq_compound_fieldindex
nc_inq_compound_fieldname
nc_inq_compound_fieldndims
nc_inq_compound_fieldoffset
nc_inq_compound_fieldtype
nc_inq_compound_name
nc_inq_compound_nfields
nc_inq_compound_size
nc_inq_dim
nc_inq_dimid
nc_inq_dimids
nc_inq_dimlen
nc_inq_dimname
nc_inq_enum
nc_inq_enum_ident
nc_inq_enum_member
nc_inq_format
nc_inq_format_extended
nc_inq_grp_full_ncid
nc_inq_grp_ncid
nc_inq_grp_parent
nc_inq_grpname
nc_inq_grpname_full
nc_inq_grpname_len
nc_inq_grps
nc_inq_libvers
nc_inq_natts
nc_inq_ncid
nc_inq_ndims
nc_inq_nvars
nc_inq_opaque
nc_inq_path
nc_inq_type
nc_inq_type_equal
nc_inq_typeid
nc_inq_typeids
nc_inq_unlimdim
nc_inq_unlimdims
nc_inq_user_type
nc_inq_var
nc_inq_var_chunking
nc_inq_var_chunking_ints
nc_inq_var_deflate
nc_inq_var_endian
nc_inq_var_fill
nc_inq_var_fletcher32
nc_inq_var_szip
nc_inq_vardimid
nc_inq_varid
nc_inq_varids
nc_inq_varname
nc_inq_varnatts
nc_inq_varndims
nc_inq_vartype
nc_inq_vlen
nc_insert_array_compound
nc_insert_compound
nc_insert_enum
nc_open
nc_put_att
nc_put_att_double
nc_put_att_float
nc_put_att_int
nc_put_att_long
nc_put_att_longlong
nc_put_att_schar
nc_put_att_short
nc_put_att_string
nc_put_att_text
nc_put_att_ubyte
nc_put_att_uchar
nc_put_att_uint
nc_put_att_ulonglong
nc_put_att_ushort
nc_put_var
nc_put_var1
nc_put_var1_double
nc_put_var1_float
nc_put_var1_int
nc_put_var1_long
nc_put_var1_longlong
nc_put_var1_schar
nc_put_var1_short
nc_put_var1_string
nc_put_var1_text
nc_put_var1_ubyte
nc_put_var1_uchar
nc_put_var1_uint
nc_put_var1_ulonglong
nc_put_var1_ushort
nc_put_var_double
nc_put_var_float
nc_put_var_int
nc_put_var_long
nc_put_var_longlong
nc_put_var_schar
nc_put_var_short
nc_put_var_string
nc_put_var_text
nc_put_var_ubyte
nc_put_var_uchar
nc_put_var_uint
nc_put_var_ulonglong
nc_put_var_ushort
nc_put_vara
nc_put_vara_double
nc_put_vara_float
nc_put_vara_int
nc_put_vara_long
nc_put_vara_longlong
nc_put_vara_schar
nc_put_vara_short
nc_put_vara_string
nc_put_vara_text
nc_put_vara_ubyte
nc_put_vara_uchar
nc_put_vara_uint
nc_put_vara_ulonglong
nc_put_vara_ushort
nc_put_varm
nc_put_varm_double
nc_put_varm_float
nc_put_varm_int
nc_put_varm_long
nc_put_varm_longlong
nc_put_varm_schar
nc_put_varm_short
nc_put_varm_string
nc_put_varm_text
nc_put_varm_ubyte
nc_put_varm_uchar
nc_put_varm_uint
nc_put_varm_ulonglong
nc_put_varm_ushort
nc_put_vars
nc_put_vars_double
nc_put_vars_float
nc_put_vars_int
nc_put_vars_long
nc_put_vars_longlong
nc_put_vars_schar
nc_put_vars_short
nc_put_vars_string
nc_put_vars_text
nc_put_vars_ubyte
nc_put_vars_uchar
nc_put_vars_uint
nc_put_vars_ulonglong
nc_put_vars_ushort
nc_put_vlen_element
nc_redef
nc_rename_att
nc_rename_dim
nc_rename_grp
nc_rename_var
nc_set_base_pe
nc_set_chunk_cache
nc_set_chunk_cache_ints
nc_set_default_format
nc_set_fill
nc_set_var_chunk_cache
nc_set_var_chunk_cache_ints
nc_show_metadata
nc_strerror
nc_sync
ncabort
ncattcopy
ncattdel
ncattget
ncattinq
ncattname
ncattput
ncattrename
ncclose
nccreate
ncdimdef
ncdimid
ncdiminq
ncdimrename
ncendef
ncerr
ncinquire
ncopen
ncopts
ncrecget
ncrecinq
ncrecput
ncredef
ncsetfill
ncsync
nctypelen
ncvardef
ncvarget
ncvarget1
ncvargetg
ncvargets
ncvarid
ncvarinq
ncvarput
ncvarput1
ncvarputg
ncvarputs
ncvarrename
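If the pre-built package does not ship a .def file, one can usually be generated from the DLL itself with the gendef tool from the mingw-w64-tools package (a sketch, assuming gendef is available; myprogram.c is a hypothetical source file):

D:\NetCDF>gendef netcdf.dll
D:\NetCDF>dlltool -d netcdf.def -D netcdf.dll -l libnetcdf.a
D:\NetCDF>gcc myprogram.c -I. -L. -lnetcdf -o myprogram.exe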

New Python package

The UDG has incorporated a new Python package.

This first version is only intended to help with the new HTTP authentication. The UDG authentication requires handling HTTP redirects and cookies.
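As an illustration of what the package handles for you, a manual request must follow redirects while preserving session cookies; with curl that looks like this (hypothetical URL and credentials):

$ curl -L -c cookies.txt -b cookies.txt -u user:password "https://udg.example/data/dataset.nc" -o dataset.nc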

  • Posted: 2014-01-23 20:25 (Updated: 2014-02-26 11:07)
  • Author: antonio

Re-branding to ECOMS UDG

The data portal has been re-branded to ECOMS User Data Gateway (ECOMS UDG).

This re-branding deprecates the oldish "Portal" term in favour of "Gateway", a concept that better represents the ECOMS UDG philosophy.

Now the UDG covers the ECOMS initiative. ECOMS is the framework for the SPECS, EUPORIAS and NACLIM projects.

Therefore, some wiki names and links have been renamed and changed accordingly.

Please let us know about any broken links: drop us a ticket!

The CFSRR dataset has been added to the portal

The CFSRR dataset has been added to the Data Portal

First version of the user data portal

Delivered the first version of the data server and access scripts (including only the System4 datasets). See the SpecsEuporias wiki.

The new Specs-Euporias Data Portal

Please drop all your comments here.

  • Posted: 2013-03-15 18:03
  • Author: antonio

Failed disk replacement in OpenIndiana

The system log shows SCSI command timeouts for target 22:

root@seal:~# cat /var/adm/messages
Feb 16 10:55:50 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 10:55:50 seal    Log info 0x31080000 received for target 22.
Feb 16 10:55:50 seal    scsi_status=0x0, ioc_status=0x804b, scsi_state=0x0
Feb 16 11:01:29 seal scsi: [ID 107833 kern.warning] WARNING: /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:29 seal    Disconnected command timeout for Target 22
Feb 16 11:01:29 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:29 seal    Log info 0x31140000 received for target 22.
Feb 16 11:01:29 seal    scsi_status=0x0, ioc_status=0x8048, scsi_state=0xc
Feb 16 11:01:29 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:29 seal    Log info 0x31140000 received for target 22.
Feb 16 11:01:29 seal    scsi_status=0x0, ioc_status=0x8048, scsi_state=0xc
Feb 16 11:01:29 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:29 seal    Log info 0x31140000 received for target 22.
Feb 16 11:01:29 seal    scsi_status=0x0, ioc_status=0x8048, scsi_state=0xc
Feb 16 11:01:29 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:29 seal    Log info 0x31140000 received for target 22.
Feb 16 11:01:29 seal    scsi_status=0x0, ioc_status=0x8048, scsi_state=0xc
Feb 16 11:01:29 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:29 seal    Log info 0x31140000 received for target 22.
Feb 16 11:01:29 seal    scsi_status=0x0, ioc_status=0x8048, scsi_state=0xc
Feb 16 11:01:29 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:29 seal    Log info 0x31140000 received for target 22.
Feb 16 11:01:29 seal    scsi_status=0x0, ioc_status=0x8048, scsi_state=0xc
Feb 16 11:01:29 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:29 seal    Log info 0x31140000 received for target 22.
Feb 16 11:01:29 seal    scsi_status=0x0, ioc_status=0x8048, scsi_state=0xc
Feb 16 11:01:59 seal scsi: [ID 107833 kern.warning] WARNING: /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:01:59 seal    passthrough command timeout
Feb 16 11:02:01 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:02:01 seal    mpt2 Firmware version v14.0.0.0 (?)
Feb 16 11:02:01 seal scsi: [ID 365881 kern.info] /pci@7a,0/pci8086,340e@7/pci1000,3080@0 (mpt_sas2):
Feb 16 11:02:01 seal    mpt2: IOC Operational.
Feb 16 11:03:46 seal scsi_vhci: [ID 734749 kern.warning] WARNING: vhci_scsi_reset 0x1
The iostat error counters help to identify the failing disk:

root@seal:~# iostat -xne
                            extended device statistics       ---- errors ---
    r/s    w/s   kr/s   kw/s wait actv wsvc_t asvc_t  %w  %b s/w h/w trn tot device
  129.2    8.0 2325.4   25.6  0.0  0.8    0.0    6.2   0  18   0   0   0   0 c2t5000CCA369C5A420d0
   88.7    4.9 1923.5   22.8  0.0  0.4    0.0    4.4   0  10   0   0   0   0 c2t5000CCA369C59910d0
  129.1    8.0 2325.7   25.6  0.0  0.9    0.0    6.2   0  18   0   0   0   0 c2t5000CCA369C5A432d0
  115.3    4.3 2088.5   30.0  0.0  0.6    0.0    5.1   0  13   0   0   0   0 c2t5000CCA369C5A374d0
  115.4    4.3 2087.8   30.0  0.0  0.6    0.0    5.0   0  13   0   0   0   0 c2t5000CCA369C59954d0
  115.8    6.6 2098.5   24.9  0.0  0.6    0.0    5.1   0  14   0   0   0   0 c2t5000CCA369C54C04d0
  129.2    8.0 2325.4   25.6  0.0  0.8    0.0    6.2   0  18   0   0   0   0 c2t5000CCA369C52E05d0
  129.2    8.0 2325.4   25.6  0.0  0.8    0.0    6.2   0  18   0   0   0   0 c2t5000CCA369C5A416d0
  115.8    6.5 2098.0   24.9  0.0  0.6    0.0    5.2   0  14   0   0   0   0 c2t5000CCA369C55766d0
  115.2    4.3 2087.5   30.0  0.0  0.6    0.0    5.1   0  13   0   0   0   0 c2t5000CCA369C5A407d0
  115.4    6.6 2099.5   25.0  0.0  0.6    0.0    5.2   0  14   0   0   0   0 c2t5000CCA369C598A7d0
   88.5    5.0 1923.9   22.9  0.0  0.4    0.0    4.2   0   9   0   0   0   0 c2t5000CCA369C59907d0
  115.5    4.3 2085.9   30.0  0.0  0.6    0.0    5.0   0  13   0   0   0   0 c2t5000CCA369C5A409d0
  129.0    8.0 2326.2   25.6  0.0  0.9    0.0    6.2   0  18   0   0   0   0 c2t5000CCA369C5C19Ad0
  115.6    6.6 2099.1   25.0  0.0  0.6    0.0    5.2   0  14   0   3   7  10 c2t5000CCA369C554CAd0
    0.0   15.9    0.6  720.7  0.0  0.0    0.0    1.4   0   0   0   0   0   0 c2t5E83A974348B629Ad0
  115.8    6.6 2098.1   24.9  0.0  0.6    0.0    5.1   0  14   0   0   0   0 c2t5000CCA369C599ACd0
  115.0    4.3 2089.8   30.0  0.0  0.6    0.0    5.1   0  13   0   0   0   0 c2t5000CCA369C5A42Dd0
  115.4    4.3 2088.2   30.0  0.0  0.6    0.0    5.1   0  13   0   0   0   0 c2t5000CCA369C5A41Dd0
   88.4    5.0 1923.7   22.9  0.0  0.4    0.0    4.2   0   9   0   5   4   9 c2t5000CCA369C5190Dd0
    6.2   11.5   37.1  107.8  0.0  0.2    0.0   11.0   0   5   0   0   0   0 c3t0d0
   17.2    7.7  391.7  776.1  0.0  0.1    0.0    2.4   0   1   0   0   0   0 c5t2d0
   85.9    5.4 1913.6   26.1  0.0  0.5    0.0    5.0   0  11   0   0   0   0 c5t3d0
   85.1    5.5 1908.9   26.2  0.0  0.5    0.0    5.1   0  11   0   0   0   0 c5t4d0
  119.8    9.4 2361.0   30.7  0.0  1.0    0.0    8.1   0  21   0   0   0   0 c5t5d0
  120.1    9.4 2360.0   30.7  0.0  1.0    0.0    8.1   0  20   0   0   0   0 c5t6d0
  120.1    9.4 2360.1   30.7  0.0  1.0    0.0    8.0   0  20   0   0   0   0 c5t7d0
  104.8    6.0 2257.4   38.1  0.0  0.8    0.0    7.0   0  16   0   0   0   0 c5t8d0
  104.8    6.0 2257.5   38.1  0.0  0.8    0.0    6.9   0  16   0   0   0   0 c5t9d0
  104.5    5.9 2258.6   38.1  0.0  0.8    0.0    7.1   0  16   0   0   0   0 c5t10d0
  104.8    5.9 2257.5   38.1  0.0  0.8    0.0    6.9   0  16   0   0   0   0 c5t11d0
   94.2    5.4 1884.4   26.1  0.0  0.3    0.0    3.3   0   8   0   0   0   0 c5t12d0
   85.3    5.5 1927.2   26.2  0.0  0.5    0.0    5.3   0  11   0   0   0   0 c3t1d0
  138.1    8.0 2296.1   25.6  0.0  0.7    0.0    4.7   0  15   0  81   0  81 c2t5000CCA369C508E0d0
   85.2    5.5 1909.0   26.2  0.0  0.5    0.0    5.1   0  11   0   0   0   0 c3t3d0
   85.7    5.5 1925.7   26.2  0.0  0.5    0.0    5.1   0  11   0   0   0   0 c3t4d0
   18.3    7.6  378.7  774.8  0.0  0.1    0.0    2.3   0   1   0   0   0   0 c3t5d0
  103.9    5.9 2260.3   38.1  0.0  0.8    0.0    7.1   0  16   0   0   0   0 c3t6d0
   85.5    5.5 1926.5   26.2  0.0  0.5    0.0    5.3   0  11   0   0   0   0 c3t7d0
   85.1    5.5 1909.5   26.2  0.0  0.5    0.0    5.2   0  11   0   0   0   0 c3t8d0
   85.2    5.4 1916.0   26.1  0.0  0.5    0.0    5.2   0  11   0   0   0   0 c3t9d0
  119.6    9.3 2362.0   30.7  0.0  1.1    0.0    8.2   0  21   0   0   0   0 c3t10d0
  119.9    9.3 2361.3   30.7  0.0  1.1    0.0    8.2   0  21   0   0   0   0 c3t11d0
  119.6    9.4 2362.4   30.7  0.0  1.1    0.0    8.2   0  21   0   0   0   0 c3t12d0
  120.0    9.4 2361.2   30.8  0.0  1.1    0.0    8.2   0  21   0   0   0   0 c3t13d0
  119.8    9.4 2361.8   30.8  0.0  1.1    0.0    8.3   0  21   0   0   0   0 c3t14d0
  119.3    9.4 2363.7   30.8  0.0  1.1    0.0    8.4   0  21   0   0   0   0 c3t15d0
  119.6    9.3 2362.5   30.7  0.0  1.1    0.0    8.2   0  21   0   0   0   0 c3t16d0
  119.8    9.3 2361.7   30.7  0.0  1.1    0.0    8.2   0  21   0   0   0   0 c3t17d0
  103.5    5.9 2261.8   38.1  0.0  0.8    0.0    7.2   0  16   0   0   0   0 c3t18d0
  104.2    5.9 2259.4   38.1  0.0  0.8    0.0    7.0   0  16   0   0   0   0 c3t19d0
  103.7    5.9 2261.1   38.1  0.0  0.8    0.0    7.2   0  16   0   0   0   0 c3t20d0
  104.0    5.9 2260.2   38.1  0.0  0.8    0.0    7.0   0  16   0   0   0   0 c3t21d0
  103.9    5.9 2260.5   38.1  0.0  0.8    0.0    7.1   0  16   0   0   0   0 c3t22d0
  104.4    5.9 2259.0   38.1  0.0  0.8    0.0    7.0   0  16   0   0   0   0 c3t23d0
    0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.4   0   0   0   0   0   0 c3t24d0
    0.0   15.9    0.6  720.7  0.0  0.0    0.0    1.0   0   0   0  92   0  92 c2t5E83A972B7F39C50d0
  125.0    6.6 2067.4   25.0  0.0  0.5    0.0    3.5   0  11   0  81   0  81 c2t5000CCA369C50370d0
   94.5    5.0 1901.5   22.9  0.0  0.3    0.0    2.6   0   7   0  76   0  76 c2t5000CCA369C47080d0
  124.4    4.4 2058.2   30.0  0.0  0.4    0.0    3.5   0  10   0  88   0  88 c2t5000CCA369C504D1d0
  124.6    6.6 2068.8   24.9  0.0  0.5    0.0    3.6   0  11   0  85   3  88 c2t5000CCA369C508E5d0
  138.2    8.0 2295.7   25.6  0.0  0.7    0.0    4.7   0  15   0  81   0  81 c2t5000CCA369C505D5d0
   94.4    4.9 1896.4   22.7  0.0  0.3    0.0    2.6   0   7   0  75   0  75 c2t5000CCA369C51558d0
  124.3    4.4 2058.7   30.0  0.0  0.4    0.0    3.5   0  10   0  88   0  88 c2t5000CCA369C4F888d0
  138.0    8.0 2296.2   25.6  0.0  0.7    0.0    4.8   0  15   0  81   0  81 c2t5000CCA369C508C9d0
  125.0    6.5 2066.8   24.9  0.0  0.5    0.0    3.5   0  11   0  81   0  81 c2t5000CCA369C50679d0
  138.2    8.0 2295.6   25.6  0.0  0.7    0.0    4.7   0  15   0  80   0  80 c2t5000CCA369C50609d0
   94.9    4.9 1899.6   22.8  0.0  0.3    0.0    2.7   0   7   0  74   0  74 c2t5000CCA369C5177Bd0
   94.9    4.9 1899.3   22.8  0.0  0.3    0.0    2.7   0   7   0  75   0  75 c2t5000CCA369C5084Bd0
  138.2    8.0 2295.8   25.6  0.0  0.7    0.0    4.7   0  15   0  81   0  81 c2t5000CCA369C506BBd0
  123.8    4.3 2061.0   30.0  0.0  0.5    0.0    3.6   0  11   0  88   0  88 c2t5000CCA369C4E90Bd0
  124.8    6.6 2068.0   24.9  0.0  0.5    0.0    3.5   0  11   0  81   0  81 c2t5000CCA369C509ECd0
  124.6    6.6 2068.8   25.0  0.0  0.5    0.0    3.5   0  11   0  80   0  80 c2t5000CCA369C508ECd0
   94.5    4.9 1896.1   22.7  0.0  0.3    0.0    2.6   0   7   0  75   0  75 c2t5000CCA369D347CEd0
  124.7    6.5 2068.2   24.9  0.0  0.5    0.0    3.5   0  11   0  81   0  81 c2t5000CCA369C5026Ed0
   94.6    4.9 1895.8   22.7  0.0  0.3    0.0    2.6   0   7   0  75   0  75 c2t5000CCA369C5178Fd0
  124.2    4.4 2059.3   30.0  0.0  0.5    0.0    3.5   0  10   0  88   0  88 c2t5000CCA369C50F1Fd0
  138.1    8.0 2295.7   25.6  0.0  0.7    0.0    4.7   0  15   0  81   0  81 c2t5000CCA369C506AFd0
  123.0    4.3 2044.5   30.0  0.1  0.8    0.7    6.1   2  16   0 1903 4197 6100 c2t5000CCA369C50680d0
    5.6   11.5   43.5  107.8  0.0  0.2    0.0   11.0   0   5   0   0   0   0 c5t1d0
    0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.0   0   0   1   0   0   1 c2t50014EE205B5C957d0
    0.3   27.7    1.4 2277.8  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC58190d0
    0.3   27.8    1.4 2277.7  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC53370d0
    0.3   27.8    1.4 2277.8  0.0  1.1    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC56E70d0
    0.3   27.8    1.4 2276.9  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC60F80d0
    0.3   27.8    1.3 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5FCB0d0
    0.3   27.8    1.4 2277.0  0.0  1.0    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC60DA0d0
    0.4   27.8    1.7 2277.8  0.0  1.0    0.0   37.1   0  11   0   0   0   0 c2t5000CCA37DC55170d0
    0.3   27.7    1.4 2277.0  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC60AA1d0
    0.3   27.7    1.4 2277.9  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC5F9B1d0
    0.3   27.8    1.4 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC52AC1d0
    0.3   27.7    1.4 2277.6  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5F831d0
    0.3   27.7    1.4 2277.7  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC55121d0
    0.3   27.8    1.4 2277.1  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC60C21d0
    0.3   27.8    1.4 2277.6  0.0  1.1    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC31681d0
    0.3   27.8    1.3 2277.9  0.0  1.0    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC60F61d0
    0.3   27.8    1.3 2277.8  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC5FEC1d0
    0.3   27.7    1.4 2277.8  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC53322d0
    0.3   27.7    1.4 2277.9  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC5FAD2d0
    0.3   27.8    1.4 2277.7  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC350B2d0
    0.3   27.8    1.4 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC59312d0
    0.3   27.8    1.4 2277.7  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC56F33d0
    0.3   27.8    1.4 2277.7  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC5E303d0
    0.3   27.8    1.4 2277.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC59433d0
    0.3   27.8    1.5 2277.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC5E343d0
    0.3   27.7    1.4 2277.0  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC60AD4d0
    0.3   27.8    1.4 2277.9  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5FDA4d0
    0.3   27.8    1.4 2277.8  0.0  1.1    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC60D34d0
    0.4   27.8    1.7 2277.0  0.0  1.1    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC4C814d0
    0.3   27.8    1.4 2277.9  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC5FEB4d0
    0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.6   0   0   0   0   0   0 c2t5000CCA37DC53385d0
    0.3   27.7    1.4 2277.1  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC61E25d0
    0.3   27.7    1.4 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5FB35d0
    0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.6   0   0   0   0   0   0 c2t5000CCA37DC53395d0
    0.4   27.8    1.8 2278.0  0.0  1.1    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC56FB5d0
    0.3   27.8    1.4 2277.9  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC60345d0
    0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.5   0   0   0   0   0   0 c2t5000CCA37DC5FF35d0
    0.3   27.8    1.4 2277.8  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC53335d0
    0.3   27.7    1.4 2277.0  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC53296d0
    0.3   27.8    1.4 2277.6  0.0  1.0    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC58926d0
    0.3   27.8    1.3 2277.8  0.0  1.0    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC58956d0
    0.3   27.8    1.4 2277.6  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC28966d0
    0.3   27.8    1.4 2277.8  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC5FF06d0
    0.3   27.8    1.4 2277.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC56D06d0
    0.3   27.8    1.4 2278.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC58576d0
    0.3   27.8    1.4 2277.7  0.0  1.1    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC5FCA7d0
    0.3   27.8    1.4 2277.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC4C877d0
    0.3   27.7    1.4 2277.6  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC532C8d0
    0.3   27.7    1.4 2277.0  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC60368d0
    0.3   27.8    1.4 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5FC58d0
    0.3   27.8    1.4 2277.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC5FDB8d0
    0.3   27.8    1.4 2276.9  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC57008d0
    0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.6   0   0   0   0   0   0 c2t5000CCA37DC4FD78d0
    0.3   27.7    1.4 2277.0  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC60CB9d0
    0.4   27.8    1.5 2277.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC59489d0
    0.3   27.7    1.4 2277.0  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC3519Ad0
    0.3   27.8    1.4 2278.0  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5F76Ad0
    0.3   27.7    1.4 2277.9  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC619DAd0
    0.3   27.7    1.5 2277.1  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5BBAAd0
    0.4   27.8    1.5 2277.1  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5881Ad0
    0.3   27.7    1.5 2277.1  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5E18Ad0
    0.3   27.8    1.5 2277.2  0.0  1.0    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC5FF0Ad0
    0.3   27.8    1.3 2277.9  0.0  1.0    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC57FCAd0
    0.3   27.8    1.4 2277.7  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC56D0Ad0
    0.3   27.7    1.3 2278.0  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC5889Bd0
    0.3   27.7    1.4 2277.7  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC581EBd0
    0.3   27.7    1.5 2277.1  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5F9FBd0
    0.3   27.7    1.4 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC61D8Bd0
    0.3   27.8    1.4 2277.7  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC60CCBd0
    0.3   27.8    1.4 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5FEABd0
    0.3   27.8    1.3 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC602FCd0
    0.4   27.8    1.7 2277.9  0.0  1.1    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC60E7Cd0
    0.3   27.8    1.4 2277.8  0.0  1.1    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC6032Cd0
    0.3   27.8    1.4 2277.7  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC5FD7Cd0
    0.3   27.8    1.3 2278.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC5335Cd0
    0.3   27.7    1.4 2277.8  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC61D7Dd0
    0.3   27.8    1.4 2277.7  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC61D6Dd0
    0.4   27.7    1.7 2277.0  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC6172Dd0
    0.4   27.7    1.5 2277.1  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5891Dd0
    0.3   27.8    1.4 2277.0  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC4B7BDd0
    0.3   27.8    1.4 2277.9  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC56E8Dd0
    0.3   27.7    1.3 2277.9  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC532CEd0
    0.4   27.8    1.5 2277.8  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC6275Ed0
    0.3   27.7    1.4 2277.7  0.0  1.1    0.0   37.5   0  11   0   0   0   0 c2t5000CCA37DC59DAEd0
    0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.6   0   0   0   0   0   0 c2t5000CCA37DC56D8Ed0
    0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.6   0   0   0   0   0   0 c2t5000CCA37DC5336Ed0
    0.3   27.8    1.4 2277.0  0.0  1.1    0.0   37.4   0  11   0   0   0   0 c2t5000CCA37DC5FC4Ed0
    0.4   27.7    1.7 2277.0  0.0  1.1    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC60CAFd0
    0.3   27.8    1.4 2277.9  0.0  1.1    0.0   37.3   0  11   0   0   0   0 c2t5000CCA37DC60FBFd0
    0.3   27.8    1.4 2277.7  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC56FEFd0
    0.3   27.8    1.4 2277.7  0.0  1.0    0.0   37.2   0  11   0   0   0   0 c2t5000CCA37DC4B46Fd0
The disk c2t5000CCA369C50680d0 shows by far the highest hardware and transport error counts, so we take it offline and replace it with the spare c3t24d0:

root@seal:~# zpool offline oceano c2t5000CCA369C50680d0
root@seal:~# zpool replace oceano c2t5000CCA369C50680d0 c3t24d0
root@seal:~$ zpool status oceano
  pool: oceano
 state: DEGRADED
status: One or more devices is currently being resilvered.  The pool will
        continue to function, possibly in a degraded state.
action: Wait for the resilver to complete.
  scan: resilver in progress since Sat Feb 16 11:06:51 2013
    23.3G scanned out of 128T at 15.5M/s, (scan is slow, no estimated time)
    13.7M resilvered, 0.02% done
config:

        NAME                         STATE     READ WRITE CKSUM
        oceano                       DEGRADED     0     0     0
          raidz2-0                   ONLINE       0     0     0
            c2t5000CCA369C5A416d0    ONLINE       0     0     0
            c2t5000CCA369C5A420d0    ONLINE       0     0     0
            c2t5000CCA369C5A432d0    ONLINE       0     0     0
            c2t5000CCA369C505D5d0    ONLINE       0     0     0
            c2t5000CCA369C506AFd0    ONLINE       0     0     0
            c2t5000CCA369C506BBd0    ONLINE       0     0     0
            c2t5000CCA369C5C19Ad0    ONLINE       0     0     0
            c2t5000CCA369C508C9d0    ONLINE       0     0     0
            c2t5000CCA369C52E05d0    ONLINE       0     0     0
            c2t5000CCA369C508E0d0    ONLINE       0     0     0
            c2t5000CCA369C50609d0    ONLINE       0     0     0
          raidz2-1                   ONLINE       0     0     0
            c5t5d0                   ONLINE       0     0     0
            c5t6d0                   ONLINE       0     0     0
            c5t7d0                   ONLINE       0     0     0
            c3t10d0                  ONLINE       0     0     0
            c3t11d0                  ONLINE       0     0     0
            c3t12d0                  ONLINE       0     0     0
            c3t13d0                  ONLINE       0     0     0
            c3t14d0                  ONLINE       0     0     0
            c3t15d0                  ONLINE       0     0     0
            c3t16d0                  ONLINE       0     0     0
            c3t17d0                  ONLINE       0     0     0
          raidz2-2                   ONLINE       0     0     0
            c5t8d0                   ONLINE       0     0     0
            c5t9d0                   ONLINE       0     0     0
            c5t10d0                  ONLINE       0     0     0
            c5t11d0                  ONLINE       0     0     0
            c3t6d0                   ONLINE       0     0     0
            c3t18d0                  ONLINE       0     0     0
            c3t19d0                  ONLINE       0     0     0
            c3t20d0                  ONLINE       0     0     0
            c3t21d0                  ONLINE       0     0     0
            c3t22d0                  ONLINE       0     0     0
            c3t23d0                  ONLINE       0     0     0
          raidz2-3                   DEGRADED     0     0     0
            c2t5000CCA369C5A41Dd0    ONLINE       0     0     0
            c2t5000CCA369C4E90Bd0    ONLINE       0     0     0
            c2t5000CCA369C5A42Dd0    ONLINE       0     0     0
            c2t5000CCA369C4F888d0    ONLINE       0     0     0
            c2t5000CCA369C5A374d0    ONLINE       0     0     0
            c2t5000CCA369C50F1Fd0    ONLINE       0     0     0
            c2t5000CCA369C5A407d0    ONLINE       0     0     0
            spare-7                  OFFLINE      0     0     0
              c2t5000CCA369C50680d0  OFFLINE      0     0     0
              c3t24d0                ONLINE       0     0     0  (resilvering)
            c2t5000CCA369C5A409d0    ONLINE       0     0     0
            c2t5000CCA369C504D1d0    ONLINE       0     0     0
            c2t5000CCA369C59954d0    ONLINE       0     0     0
          raidz2-4                   ONLINE       0     0     0
            c2t5000CCA369C55766d0    ONLINE       0     0     0
            c2t5000CCA369C508E5d0    ONLINE       0     0     0
            c2t5000CCA369C54C04d0    ONLINE       0     0     0
            c2t5000CCA369C508ECd0    ONLINE       0     0     0
            c2t5000CCA369C554CAd0    ONLINE       0     0     0
            c2t5000CCA369C50370d0    ONLINE       0     0     0
            c2t5000CCA369C598A7d0    ONLINE       0     0     0
            c2t5000CCA369C509ECd0    ONLINE       0     0     0
            c2t5000CCA369C599ACd0    ONLINE       0     0     0
            c2t5000CCA369C5026Ed0    ONLINE       0     0     0
            c2t5000CCA369C50679d0    ONLINE       0     0     0
          raidz2-5                   ONLINE       0     0     0
            c2t5000CCA369D347CEd0    ONLINE       0     0     0
            c2t5000CCA369C5084Bd0    ONLINE       0     0     0
            c2t5000CCA369C5190Dd0    ONLINE       0     0     0
            c2t5000CCA369C51558d0    ONLINE       0     0     0
            c2t5000CCA369C5177Bd0    ONLINE       0     0     0
            c2t5000CCA369C59907d0    ONLINE       0     0     0
            c2t5000CCA369C5178Fd0    ONLINE       0     0     0
            c2t5000CCA369C59910d0    ONLINE       0     0     0
            c2t5000CCA369C47080d0    ONLINE       0     0     0
          raidz2-6                   ONLINE       0     0     0
            c5t4d0                   ONLINE       0     0     0
            c5t12d0                  ONLINE       0     0     0
            c3t7d0                   ONLINE       0     0     0
            c3t8d0                   ONLINE       0     0     0
            c3t9d0                   ONLINE       0     0     0
            c3t1d0                   ONLINE       0     0     0
            c3t3d0                   ONLINE       0     0     0
            c5t3d0                   ONLINE       0     0     0
            c3t4d0                   ONLINE       0     0     0
        logs
          mirror-7                   ONLINE       0     0     0
            c2t5E83A972B7F39C50d0    ONLINE       0     0     0
            c2t5E83A974348B629Ad0    ONLINE       0     0     0
        cache
          c5t2d0                     ONLINE       0     0     0
          c3t5d0                     ONLINE       0     0     0
        spares
          c3t24d0                    INUSE     currently in use

errors: No known data errors
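While the pool resilvers onto the spare, the failed disk remains attached under spare-7. Once the resilver completes, the usual follow-up is to detach it so that c3t24d0 becomes a permanent member of raidz2-3 (a sketch of the standard step, not captured in this session):

root@seal:~# zpool detach oceano c2t5000CCA369C50680d0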

And now we locate the disk in the enclosure, following the instructions on DiskLocationOpenindiana:

root@seal:~# sg_inq -u /dev/rdsk/c2t5000CCA369C50680d0
SCSI_IDENT_LUN_NAA=5000cca369c50680
SCSI_IDENT_PORT_NAA=5003048001155a4c
SCSI_IDENT_PORT_RELATIVE=1
root@seal:~# sg_inq -u /dev/es/ses2
SCSI_IDENT_PORT_NAA=5003048001155a7d
SCSI_IDENT_LUN_NAA=5003048001155a7d
root@seal:~# sg_ses -I 0,0 -p aes /dev/es/ses2
  LSI CORP  SAS2X36           0717
  Primary enclosure logical identifier (hex): 5003048001155a7f
Additional element status diagnostic page:
  generation code: 0x0
  additional element status descriptor list
      Element index: 0
        Transport protocol: SAS
        number of phys: 1, not all phys: 0, device slot number: 0
        phy index: 0
          device type: no device attached
          initiator port for:
          target port for: SATA_device
          attached SAS address: 0x5003048001155a7f
          SAS address: 0x5003048001155a4c
          phy identifier: 0x0
root@seal:~# sg_ses -I 0,0 --set=fault /dev/es/ses2
root@seal:~# sg_ses -I 0,0 --get=fault /dev/es/ses2
1

The fault indicator for slot 0 is now on, so the failed disk can be identified visually in the enclosure.
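After the physical swap, the indicator can be switched off again (same sg_ses tool; not part of the captured session):

root@seal:~# sg_ses -I 0,0 --clear=fault /dev/es/ses2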

Send and receive backups in ZFS

To back up a dataset tree with ZFS send/receive, start from the source tree and the (nearly empty) destination pool:

root@seal:~# zfs list -rt all oceano/pruebas
NAME                                  USED  AVAIL  REFER  MOUNTPOINT
oceano/pruebas                        253G  1.42T  70.7K  /oceano/pruebas
oceano/pruebas/users_seal             253G  1.42T   212G  /oceano/pruebas/users_seal
oceano/pruebas/users_seal/anonimo    98.9K  1.42T  98.9K  /oceano/pruebas/users_seal/anonimo
oceano/pruebas/users_seal/antonio    20.0G  1.42T  20.0G  /oceano/pruebas/users_seal/antonio
oceano/pruebas/users_seal/carlos     1.36G  1.42T  1.36G  /oceano/pruebas/users_seal/carlos
oceano/pruebas/users_seal/carmen     6.64G  1.42T  6.64G  /oceano/pruebas/users_seal/carmen
oceano/pruebas/users_seal/casanueva  1.29M  1.42T  1.29M  /oceano/pruebas/users_seal/casanueva
oceano/pruebas/users_seal/ccabrillo   551M  1.42T   551M  /oceano/pruebas/users_seal/ccabrillo
oceano/pruebas/users_seal/chus       7.21G  1.42T  7.21G  /oceano/pruebas/users_seal/chus
oceano/pruebas/users_seal/daniel     5.28G  1.42T  5.28G  /oceano/pruebas/users_seal/daniel
root@seal:~# zfs list -rt all depot
NAME    USED  AVAIL  REFER  MOUNTPOINT
depot  1.87M   114T   329K  /depot
Take a recursive snapshot and send the full replication stream into depot:

root@seal:~# zfs snapshot -r oceano/pruebas@backup1
root@seal:~# zfs send -vR oceano/pruebas@backup1 | zfs recv -vFd depot
send from @ to oceano/pruebas@backup1 estimated size is 16K
send from @ to oceano/pruebas/users_seal@backup1 estimated size is 211G
send from @ to oceano/pruebas/users_seal/carlos@backup1 estimated size is 1.34G
send from @ to oceano/pruebas/users_seal/chus@backup1 estimated size is 7.11G
send from @ to oceano/pruebas/users_seal/anonimo@backup1 estimated size is 33K
send from @ to oceano/pruebas/users_seal/daniel@backup1 estimated size is 5.22G
send from @ to oceano/pruebas/users_seal/casanueva@backup1 estimated size is 1.21M
send from @ to oceano/pruebas/users_seal/antonio@backup1 estimated size is 19.8G
send from @ to oceano/pruebas/users_seal/carmen@backup1 estimated size is 6.64G
send from @ to oceano/pruebas/users_seal/ccabrillo@backup1 estimated size is 505M
total estimated size is 251G
TIME        SENT   SNAPSHOT
receiving full stream of oceano/pruebas@backup1 into depot/pruebas@backup1
received 47.9KB stream in 1 seconds (47.9KB/sec)
TIME        SENT   SNAPSHOT
receiving full stream of oceano/pruebas/users_seal@backup1 into depot/pruebas/users_seal@backup1
20:26:40    313M   oceano/pruebas/users_seal@backup1
20:26:41    763M   oceano/pruebas/users_seal@backup1
20:26:42    885M   oceano/pruebas/users_seal@backup1
20:26:43   1.13G   oceano/pruebas/users_seal@backup1
20:26:44   1.56G   oceano/pruebas/users_seal@backup1
20:26:45   1.81G   oceano/pruebas/users_seal@backup1
20:26:46   2.13G   oceano/pruebas/users_seal@backup1
20:26:47   2.65G   oceano/pruebas/users_seal@backup1
20:26:48   2.90G   oceano/pruebas/users_seal@backup1
....................................................
20:31:38   77.8G   oceano/pruebas/users_seal@backup1
20:31:39   78.1G   oceano/pruebas/users_seal@backup1
20:31:40   78.5G   oceano/pruebas/users_seal@backup1
20:31:41   78.9G   oceano/pruebas/users_seal@backup1
20:31:42   79.2G   oceano/pruebas/users_seal@backup1
20:31:43   79.4G   oceano/pruebas/users_seal@backup1
....................................................
20:48:22    209G   oceano/pruebas/users_seal@backup1
20:48:23    210G   oceano/pruebas/users_seal@backup1
20:48:24    210G   oceano/pruebas/users_seal@backup1
20:48:25    210G   oceano/pruebas/users_seal@backup1
20:48:26    210G   oceano/pruebas/users_seal@backup1
20:48:27    210G   oceano/pruebas/users_seal@backup1
20:48:28    210G   oceano/pruebas/users_seal@backup1
20:48:29    211G   oceano/pruebas/users_seal@backup1
20:48:30    211G   oceano/pruebas/users_seal@backup1
20:48:31    211G   oceano/pruebas/users_seal@backup1
20:48:32    211G   oceano/pruebas/users_seal@backup1
20:48:33    212G   oceano/pruebas/users_seal@backup1
20:48:34    212G   oceano/pruebas/users_seal@backup1
20:48:35    212G   oceano/pruebas/users_seal@backup1
TIME        SENT   SNAPSHOT
20:48:37   16.3K   oceano/pruebas/users_seal/carlos@backup1
received 212GB stream in 1319 seconds (164MB/sec)
receiving full stream of oceano/pruebas/users_seal/carlos@backup1 into depot/pruebas/users_seal/carlos@backup1
20:48:38   16.6K   oceano/pruebas/users_seal/carlos@backup1
20:48:39    378M   oceano/pruebas/users_seal/carlos@backup1
20:48:40    433M   oceano/pruebas/users_seal/carlos@backup1
20:48:41    552M   oceano/pruebas/users_seal/carlos@backup1
20:48:42    972M   oceano/pruebas/users_seal/carlos@backup1
20:48:43   1.06G   oceano/pruebas/users_seal/carlos@backup1
20:48:44   1.26G   oceano/pruebas/users_seal/carlos@backup1
20:48:45   1.34G   oceano/pruebas/users_seal/carlos@backup1
TIME        SENT   SNAPSHOT
received 1.36GB stream in 8 seconds (174MB/sec)
receiving full stream of oceano/pruebas/users_seal/chus@backup1 into depot/pruebas/users_seal/chus@backup1
20:48:47    290M   oceano/pruebas/users_seal/chus@backup1
20:48:48    789M   oceano/pruebas/users_seal/chus@backup1
20:48:49   1.11G   oceano/pruebas/users_seal/chus@backup1
20:48:50   1.39G   oceano/pruebas/users_seal/chus@backup1
20:48:51   1.75G   oceano/pruebas/users_seal/chus@backup1
20:48:52   2.08G   oceano/pruebas/users_seal/chus@backup1
20:48:53   2.29G   oceano/pruebas/users_seal/chus@backup1
.........................................................
20:49:14   6.46G   oceano/pruebas/users_seal/chus@backup1
20:49:15   6.68G   oceano/pruebas/users_seal/chus@backup1
20:49:16   6.75G   oceano/pruebas/users_seal/chus@backup1
20:49:17   6.85G   oceano/pruebas/users_seal/chus@backup1
20:49:18   7.16G   oceano/pruebas/users_seal/chus@backup1
TIME        SENT   SNAPSHOT
received 7.19GB stream in 33 seconds (223MB/sec)
receiving full stream of oceano/pruebas/users_seal/anonimo@backup1 into depot/pruebas/users_seal/anonimo@backup1
20:49:19   17.2K   oceano/pruebas/users_seal/anonimo@backup1
TIME        SENT   SNAPSHOT
received 76.8KB stream in 1 seconds (76.8KB/sec)
receiving full stream of oceano/pruebas/users_seal/daniel@backup1 into depot/pruebas/users_seal/daniel@backup1
20:49:21    342M   oceano/pruebas/users_seal/daniel@backup1
20:49:22    568M   oceano/pruebas/users_seal/daniel@backup1
20:49:23    619M   oceano/pruebas/users_seal/daniel@backup1
20:49:24    798M   oceano/pruebas/users_seal/daniel@backup1
20:49:25    817M   oceano/pruebas/users_seal/daniel@backup1
...........................................................
20:49:37   3.75G   oceano/pruebas/users_seal/daniel@backup1
20:49:38   3.97G   oceano/pruebas/users_seal/daniel@backup1
20:49:39   4.40G   oceano/pruebas/users_seal/daniel@backup1
20:49:40   4.79G   oceano/pruebas/users_seal/daniel@backup1
20:49:41   5.04G   oceano/pruebas/users_seal/daniel@backup1
TIME        SENT   SNAPSHOT
received 5.27GB stream in 22 seconds (245MB/sec)
receiving full stream of oceano/pruebas/users_seal/casanueva@backup1 into depot/pruebas/users_seal/casanueva@backup1
TIME        SENT   SNAPSHOT
received 1.29MB stream in 1 seconds (1.29MB/sec)
receiving full stream of oceano/pruebas/users_seal/antonio@backup1 into depot/pruebas/users_seal/antonio@backup1
20:49:44    238M   oceano/pruebas/users_seal/antonio@backup1
20:49:45    703M   oceano/pruebas/users_seal/antonio@backup1
20:49:46   1001M   oceano/pruebas/users_seal/antonio@backup1
20:49:47   1.35G   oceano/pruebas/users_seal/antonio@backup1
............................................................
20:51:03   19.2G   oceano/pruebas/users_seal/antonio@backup1
20:51:04   19.6G   oceano/pruebas/users_seal/antonio@backup1
20:51:05   19.8G   oceano/pruebas/users_seal/antonio@backup1
20:51:06   19.9G   oceano/pruebas/users_seal/antonio@backup1
TIME        SENT   SNAPSHOT
received 19.9GB stream in 86 seconds (237MB/sec)
receiving full stream of oceano/pruebas/users_seal/carmen@backup1 into depot/pruebas/users_seal/carmen@backup1
20:51:09   16.6K   oceano/pruebas/users_seal/carmen@backup1
20:51:10    410M   oceano/pruebas/users_seal/carmen@backup1
20:51:11    787M   oceano/pruebas/users_seal/carmen@backup1
20:51:12   1.03G   oceano/pruebas/users_seal/carmen@backup1
20:51:13   1.24G   oceano/pruebas/users_seal/carmen@backup1
...........................................................
20:51:25   5.39G   oceano/pruebas/users_seal/carmen@backup1
20:51:26   5.87G   oceano/pruebas/users_seal/carmen@backup1
20:51:27   6.20G   oceano/pruebas/users_seal/carmen@backup1
20:51:28   6.52G   oceano/pruebas/users_seal/carmen@backup1
TIME        SENT   SNAPSHOT
received 6.65GB stream in 20 seconds (340MB/sec)
receiving full stream of oceano/pruebas/users_seal/ccabrillo@backup1 into depot/pruebas/users_seal/ccabrillo@backup1
20:51:30   7.54M   oceano/pruebas/users_seal/ccabrillo@backup1
20:51:31   41.3M   oceano/pruebas/users_seal/ccabrillo@backup1
20:51:32   63.5M   oceano/pruebas/users_seal/ccabrillo@backup1
20:51:33   85.7M   oceano/pruebas/users_seal/ccabrillo@backup1
20:51:34    118M   oceano/pruebas/users_seal/ccabrillo@backup1
20:51:35    138M   oceano/pruebas/users_seal/ccabrillo@backup1
20:51:36    350M   oceano/pruebas/users_seal/ccabrillo@backup1
received 549MB stream in 8 seconds (68.6MB/sec)
Now we change the source data (remove one directory, rsync another) before taking an incremental backup:

root@seal:~# ls /oceano/pruebas/users_seal
anonimo  carmen     chus    ensembles  gutierjm  irene  lluis       maider      markel     Obsoletos    pazo    rodri           rsync_users.log  sixto      valva
antonio  casanueva  daniel  fernando   ifca      ivan   lost+found  margherita  maxtuni    ortin        prueba  rodrigma        sarah            swen
carlos   ccabrillo  dfrias  GRID       imeteo    juaco  luismi      mariona     minguezsr  pararevisar  rafa    rsync_user.log  scarab           umeteoadm
root@seal:~# rm -fr /oceano/pruebas/users_seal/sixto
root@seal:~# ls /oceano/pruebas/users_seal
anonimo  carmen     chus    ensembles  gutierjm  irene  lluis       maider      markel     Obsoletos    pazo    rodri           rsync_users.log  swen
antonio  casanueva  daniel  fernando   ifca      ivan   lost+found  margherita  maxtuni    ortin        prueba  rodrigma        sarah            umeteoadm
carlos   ccabrillo  dfrias  GRID       imeteo    juaco  luismi      mariona     minguezsr  pararevisar  rafa    rsync_user.log  scarab           valva
root@seal:~# rsync -auv /oceano/pruebas/users_seal/scarab /oceano/pruebas/users_seal/antonio
.........................
scarab/WEB/scarab-1.0-b20/xdocs/misc/httpclient-example.xml
scarab/WEB/scarab-1.0-b20/xdocs/misc/i18n.xml
scarab/WEB/scarab-1.0-b20/xdocs/misc/index.xml
scarab/WEB/scarab-1.0-b20/xdocs/misc/package_dependencies.gif
scarab/WEB/scarab-1.0-b20/xdocs/misc/package_dependencies.xml
scarab/WEB/scarab-1.0-b20/xdocs/misc/project_development_tools_and_technology.xml
scarab/WEB/scarab-1.0-b20/xdocs/misc/scarab-design.xml
scarab/WEB/scarab-1.0-b20/xdocs/misc/scarab-i18n.png
scarab/WEB/scarab-1.0-b20/xdocs/misc/workflow.xml
scarab/WEB/scarab-1.0-b20/xdocs/xmlimport/
scarab/WEB/scarab-1.0-b20/xdocs/xmlimport/import-issues.xml
scarab/WEB/scarab-1.0-b20/xdocs/xmlimport/index.xml
scarab/WEB/scarab-1.0-b20/xdocs/xmlimport/xml-format.xml

sent 291317814 bytes  received 110662 bytes  6008834.56 bytes/sec
total size is 290889987  speedup is 1.00
root@seal:~# zfs list -rt all oceano/pruebas
NAME                                          USED  AVAIL  REFER  MOUNTPOINT
oceano/pruebas                                253G  1.41T  70.7K  /oceano/pruebas
oceano/pruebas@backup1                           0      -  70.7K  -
oceano/pruebas/users_seal                     253G  1.41T   211G  /oceano/pruebas/users_seal
oceano/pruebas/users_seal@backup1             262M      -   212G  -
oceano/pruebas/users_seal/anonimo             143K  1.41T  99.3K  /oceano/pruebas/users_seal/anonimo
oceano/pruebas/users_seal/anonimo@backup1    43.5K      -  98.9K  -
oceano/pruebas/users_seal/antonio            20.3G  1.41T  20.2G  /oceano/pruebas/users_seal/antonio
oceano/pruebas/users_seal/antonio@backup1    4.74M      -  20.0G  -
oceano/pruebas/users_seal/carlos             1.36G  1.41T  1.36G  /oceano/pruebas/users_seal/carlos
oceano/pruebas/users_seal/carlos@backup1         0      -  1.36G  -
oceano/pruebas/users_seal/carmen             6.64G  1.41T  6.64G  /oceano/pruebas/users_seal/carmen
oceano/pruebas/users_seal/carmen@backup1         0      -  6.64G  -
oceano/pruebas/users_seal/casanueva          1.29M  1.41T  1.29M  /oceano/pruebas/users_seal/casanueva
oceano/pruebas/users_seal/casanueva@backup1      0      -  1.29M  -
oceano/pruebas/users_seal/ccabrillo           551M  1.41T   551M  /oceano/pruebas/users_seal/ccabrillo
oceano/pruebas/users_seal/ccabrillo@backup1      0      -   551M  -
oceano/pruebas/users_seal/chus               7.21G  1.41T  7.21G  /oceano/pruebas/users_seal/chus
oceano/pruebas/users_seal/chus@backup1           0      -  7.21G  -
oceano/pruebas/users_seal/daniel             5.28G  1.41T  5.28G  /oceano/pruebas/users_seal/daniel
oceano/pruebas/users_seal/daniel@backup1         0      -  5.28G  -
root@seal:~# zfs list -rt all depot
NAME                                         USED  AVAIL  REFER  MOUNTPOINT
depot                                        264G   114T   347K  /depot
depot/pruebas                                264G   114T   347K  /depot/pruebas
depot/pruebas@backup1                           0      -   347K  -
depot/pruebas/users_seal                     264G   114T   220G  /depot/pruebas/users_seal
depot/pruebas/users_seal@backup1                0      -   220G  -
depot/pruebas/users_seal/anonimo             494K   114T   494K  /depot/pruebas/users_seal/anonimo
depot/pruebas/users_seal/anonimo@backup1        0      -   494K  -
depot/pruebas/users_seal/antonio            21.3G   114T  21.3G  /depot/pruebas/users_seal/antonio
depot/pruebas/users_seal/antonio@backup1        0      -  21.3G  -
depot/pruebas/users_seal/carlos             1.49G   114T  1.49G  /depot/pruebas/users_seal/carlos
depot/pruebas/users_seal/carlos@backup1         0      -  1.49G  -
depot/pruebas/users_seal/carmen             6.69G   114T  6.69G  /depot/pruebas/users_seal/carmen
depot/pruebas/users_seal/carmen@backup1         0      -  6.69G  -
depot/pruebas/users_seal/casanueva          1.73M   114T  1.73M  /depot/pruebas/users_seal/casanueva
depot/pruebas/users_seal/casanueva@backup1      0      -  1.73M  -
depot/pruebas/users_seal/ccabrillo           898M   114T   898M  /depot/pruebas/users_seal/ccabrillo
depot/pruebas/users_seal/ccabrillo@backup1      0      -   898M  -
depot/pruebas/users_seal/chus               8.06G   114T  8.06G  /depot/pruebas/users_seal/chus
depot/pruebas/users_seal/chus@backup1           0      -  8.06G  -
depot/pruebas/users_seal/daniel             5.73G   114T  5.73G  /depot/pruebas/users_seal/daniel
depot/pruebas/users_seal/daniel@backup1         0      -  5.73G  -
Take a second recursive snapshot:

root@seal:~# zfs snapshot -r oceano/pruebas@backup2
root@seal:~# zfs list -rt all oceano/pruebas
NAME                                          USED  AVAIL  REFER  MOUNTPOINT
oceano/pruebas                                253G  1.41T  70.7K  /oceano/pruebas
oceano/pruebas@backup1                           0      -  70.7K  -
oceano/pruebas@backup2                           0      -  70.7K  -
oceano/pruebas/users_seal                     253G  1.41T   211G  /oceano/pruebas/users_seal
oceano/pruebas/users_seal@backup1             262M      -   212G  -
oceano/pruebas/users_seal@backup2                0      -   211G  -
oceano/pruebas/users_seal/anonimo             143K  1.41T  99.3K  /oceano/pruebas/users_seal/anonimo
oceano/pruebas/users_seal/anonimo@backup1    43.5K      -  98.9K  -
oceano/pruebas/users_seal/anonimo@backup2        0      -  99.3K  -
oceano/pruebas/users_seal/antonio            20.3G  1.41T  20.2G  /oceano/pruebas/users_seal/antonio
oceano/pruebas/users_seal/antonio@backup1    4.74M      -  20.0G  -
oceano/pruebas/users_seal/antonio@backup2        0      -  20.2G  -
oceano/pruebas/users_seal/carlos             1.36G  1.41T  1.36G  /oceano/pruebas/users_seal/carlos
oceano/pruebas/users_seal/carlos@backup1         0      -  1.36G  -
oceano/pruebas/users_seal/carlos@backup2         0      -  1.36G  -
oceano/pruebas/users_seal/carmen             6.64G  1.41T  6.64G  /oceano/pruebas/users_seal/carmen
oceano/pruebas/users_seal/carmen@backup1         0      -  6.64G  -
oceano/pruebas/users_seal/carmen@backup2         0      -  6.64G  -
oceano/pruebas/users_seal/casanueva          1.29M  1.41T  1.29M  /oceano/pruebas/users_seal/casanueva
oceano/pruebas/users_seal/casanueva@backup1      0      -  1.29M  -
oceano/pruebas/users_seal/casanueva@backup2      0      -  1.29M  -
oceano/pruebas/users_seal/ccabrillo           551M  1.41T   551M  /oceano/pruebas/users_seal/ccabrillo
oceano/pruebas/users_seal/ccabrillo@backup1      0      -   551M  -
oceano/pruebas/users_seal/ccabrillo@backup2      0      -   551M  -
oceano/pruebas/users_seal/chus               7.21G  1.41T  7.21G  /oceano/pruebas/users_seal/chus
oceano/pruebas/users_seal/chus@backup1           0      -  7.21G  -
oceano/pruebas/users_seal/chus@backup2           0      -  7.21G  -
oceano/pruebas/users_seal/daniel             5.28G  1.41T  5.28G  /oceano/pruebas/users_seal/daniel
oceano/pruebas/users_seal/daniel@backup1         0      -  5.28G  -
oceano/pruebas/users_seal/daniel@backup2         0      -  5.28G  -
root@seal:~# zfs send -vR -i backup1 oceano/pruebas@backup2 | zfs recv -vFd depot
send from @backup1 to oceano/pruebas@backup2 estimated size is 0
send from @backup1 to oceano/pruebas/users_seal@backup2 estimated size is 922K
send from @backup1 to oceano/pruebas/users_seal/carlos@backup2 estimated size is 0
send from @backup1 to oceano/pruebas/users_seal/chus@backup2 estimated size is 0
send from @backup1 to oceano/pruebas/users_seal/anonimo@backup2 estimated size is 10.5K
send from @backup1 to oceano/pruebas/users_seal/daniel@backup2 estimated size is 0
send from @backup1 to oceano/pruebas/users_seal/casanueva@backup2 estimated size is 0
send from @backup1 to oceano/pruebas/users_seal/antonio@backup2 estimated size is 293M
send from @backup1 to oceano/pruebas/users_seal/carmen@backup2 estimated size is 0
send from @backup1 to oceano/pruebas/users_seal/ccabrillo@backup2 estimated size is 0
total estimated size is 294M
TIME        SENT   SNAPSHOT
receiving incremental stream of oceano/pruebas@backup2 into depot/pruebas@backup2
TIME        SENT   SNAPSHOT
received 312B stream in 1 seconds (312B/sec)
receiving incremental stream of oceano/pruebas/users_seal@backup2 into depot/pruebas/users_seal@backup2
20:56:08   4.83M   oceano/pruebas/users_seal@backup2
TIME        SENT   SNAPSHOT
received 5.10MB stream in 2 seconds (2.55MB/sec)
receiving incremental stream of oceano/pruebas/users_seal/carlos@backup2 into depot/pruebas/users_seal/carlos@backup2
TIME        SENT   SNAPSHOT
received 312B stream in 1 seconds (312B/sec)
receiving incremental stream of oceano/pruebas/users_seal/chus@backup2 into depot/pruebas/users_seal/chus@backup2
TIME        SENT   SNAPSHOT
received 312B stream in 1 seconds (312B/sec)
receiving incremental stream of oceano/pruebas/users_seal/anonimo@backup2 into depot/pruebas/users_seal/anonimo@backup2
TIME        SENT   SNAPSHOT
received 17.4KB stream in 1 seconds (17.4KB/sec)
receiving incremental stream of oceano/pruebas/users_seal/daniel@backup2 into depot/pruebas/users_seal/daniel@backup2
TIME        SENT   SNAPSHOT
TIME        SENT   SNAPSHOT
received 312B stream in 1 seconds (312B/sec)
receiving incremental stream of oceano/pruebas/users_seal/casanueva@backup2 into depot/pruebas/users_seal/casanueva@backup2
received 312B stream in 1 seconds (312B/sec)
receiving incremental stream of oceano/pruebas/users_seal/antonio@backup2 into depot/pruebas/users_seal/antonio@backup2
20:56:14   16.8K   oceano/pruebas/users_seal/antonio@backup2
20:56:15   94.3M   oceano/pruebas/users_seal/antonio@backup2
20:56:16    150M   oceano/pruebas/users_seal/antonio@backup2
20:56:17    286M   oceano/pruebas/users_seal/antonio@backup2
TIME        SENT   SNAPSHOT
TIME        SENT   SNAPSHOT
received 309MB stream in 5 seconds (61.9MB/sec)
receiving incremental stream of oceano/pruebas/users_seal/carmen@backup2 into depot/pruebas/users_seal/carmen@backup2
received 312B stream in 1 seconds (312B/sec)
receiving incremental stream of oceano/pruebas/users_seal/ccabrillo@backup2 into depot/pruebas/users_seal/ccabrillo@backup2
received 312B stream in 1 seconds (312B/sec)
root@seal:~# zfs list -rt all depot
NAME                                         USED  AVAIL  REFER  MOUNTPOINT
depot                                        265G   114T   347K  /depot
depot/pruebas                                265G   114T   347K  /depot/pruebas
depot/pruebas@backup1                       18.3K      -   347K  -
depot/pruebas@backup2                           0      -   347K  -
depot/pruebas/users_seal                     265G   114T   220G  /depot/pruebas/users_seal
depot/pruebas/users_seal@backup1             389M      -   220G  -
depot/pruebas/users_seal@backup2                0      -   220G  -
depot/pruebas/users_seal/anonimo             676K   114T   494K  /depot/pruebas/users_seal/anonimo
depot/pruebas/users_seal/anonimo@backup1     183K      -   494K  -
depot/pruebas/users_seal/anonimo@backup2        0      -   494K  -
depot/pruebas/users_seal/antonio            21.7G   114T  21.7G  /depot/pruebas/users_seal/antonio
depot/pruebas/users_seal/antonio@backup1    9.84M      -  21.3G  -
depot/pruebas/users_seal/antonio@backup2        0      -  21.7G  -
depot/pruebas/users_seal/carlos             1.49G   114T  1.49G  /depot/pruebas/users_seal/carlos
depot/pruebas/users_seal/carlos@backup1     18.3K      -  1.49G  -
depot/pruebas/users_seal/carlos@backup2         0      -  1.49G  -
depot/pruebas/users_seal/carmen             6.69G   114T  6.69G  /depot/pruebas/users_seal/carmen
depot/pruebas/users_seal/carmen@backup1     18.3K      -  6.69G  -
depot/pruebas/users_seal/carmen@backup2         0      -  6.69G  -
depot/pruebas/users_seal/casanueva          1.75M   114T  1.73M  /depot/pruebas/users_seal/casanueva
depot/pruebas/users_seal/casanueva@backup1  18.3K      -  1.73M  -
depot/pruebas/users_seal/casanueva@backup2      0      -  1.73M  -
depot/pruebas/users_seal/ccabrillo           898M   114T   898M  /depot/pruebas/users_seal/ccabrillo
depot/pruebas/users_seal/ccabrillo@backup1  18.3K      -   898M  -
depot/pruebas/users_seal/ccabrillo@backup2      0      -   898M  -
depot/pruebas/users_seal/chus               8.06G   114T  8.06G  /depot/pruebas/users_seal/chus
depot/pruebas/users_seal/chus@backup1       18.3K      -  8.06G  -
depot/pruebas/users_seal/chus@backup2           0      -  8.06G  -
depot/pruebas/users_seal/daniel             5.73G   114T  5.73G  /depot/pruebas/users_seal/daniel
depot/pruebas/users_seal/daniel@backup1     18.3K      -  5.73G  -
depot/pruebas/users_seal/daniel@backup2         0      -  5.73G  -
root@seal:~# zfs destroy -r depot/pruebas@backup1
root@seal:~# zfs destroy -r depot/pruebas@backup2
root@seal:~# zfs list -rt all depot
NAME                                 USED  AVAIL  REFER  MOUNTPOINT
depot                                264G   114T   347K  /depot
depot/pruebas                        264G   114T   347K  /depot/pruebas
depot/pruebas/users_seal             264G   114T   220G  /depot/pruebas/users_seal
depot/pruebas/users_seal/anonimo     494K   114T   494K  /depot/pruebas/users_seal/anonimo
depot/pruebas/users_seal/antonio    21.7G   114T  21.7G  /depot/pruebas/users_seal/antonio
depot/pruebas/users_seal/carlos     1.49G   114T  1.49G  /depot/pruebas/users_seal/carlos
depot/pruebas/users_seal/carmen     6.69G   114T  6.69G  /depot/pruebas/users_seal/carmen
depot/pruebas/users_seal/casanueva  1.73M   114T  1.73M  /depot/pruebas/users_seal/casanueva
depot/pruebas/users_seal/ccabrillo   898M   114T   898M  /depot/pruebas/users_seal/ccabrillo
depot/pruebas/users_seal/chus       8.06G   114T  8.06G  /depot/pruebas/users_seal/chus
depot/pruebas/users_seal/daniel     5.73G   114T  5.73G  /depot/pruebas/users_seal/daniel
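
The whole incremental cycle above (snapshot, send/receive, cleanup) can be wrapped in a small script. A minimal sketch, assuming an initial full replication to depot has already been done and reusing the snapshot names from this example:

#!/bin/sh
# Incremental replication of oceano/pruebas into the depot pool (sketch).
OLD=backup1   # previous snapshot, already present on both pools
NEW=backup2   # new snapshot to create and send

# Recursive snapshot of the source tree
zfs snapshot -r oceano/pruebas@$NEW

# Send everything newer than $OLD and receive it under depot
zfs send -R -i $OLD oceano/pruebas@$NEW | zfs recv -Fd depot

# $NEW is now the incremental base, so $OLD can be destroyed on both sides
zfs destroy -r oceano/pruebas@$OLD
zfs destroy -r depot/pruebas@$OLD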

Disk location in OpenIndiana

List all the disks configured in the system

root@seal:~# ls /dev/rdsk/*d0s0
/dev/rdsk/c2t5000CCA369C47080d0s0  /dev/rdsk/c2t5000CCA37DC31681d0s0  /dev/rdsk/c2t5000CCA37DC5BBAAd0s0  /dev/rdsk/c2t5000CCA37DC61D8Bd0s0
/dev/rdsk/c2t5000CCA369C4E90Bd0s0  /dev/rdsk/c2t5000CCA37DC350B2d0s0  /dev/rdsk/c2t5000CCA37DC5E18Ad0s0  /dev/rdsk/c2t5000CCA37DC61E25d0s0
/dev/rdsk/c2t5000CCA369C4F888d0s0  /dev/rdsk/c2t5000CCA37DC3519Ad0s0  /dev/rdsk/c2t5000CCA37DC5E303d0s0  /dev/rdsk/c2t5000CCA37DC6275Ed0s0
/dev/rdsk/c2t5000CCA369C5026Ed0s0  /dev/rdsk/c2t5000CCA37DC4B46Fd0s0  /dev/rdsk/c2t5000CCA37DC5E343d0s0  /dev/rdsk/c2t50014EE205B5C957d0s0
/dev/rdsk/c2t5000CCA369C50370d0s0  /dev/rdsk/c2t5000CCA37DC4B7BDd0s0  /dev/rdsk/c2t5000CCA37DC5F76Ad0s0  /dev/rdsk/c2t5E83A972B7F39C50d0s0
/dev/rdsk/c2t5000CCA369C504D1d0s0  /dev/rdsk/c2t5000CCA37DC4C814d0s0  /dev/rdsk/c2t5000CCA37DC5F831d0s0  /dev/rdsk/c2t5E83A974348B629Ad0s0
/dev/rdsk/c2t5000CCA369C505D5d0s0  /dev/rdsk/c2t5000CCA37DC4C877d0s0  /dev/rdsk/c2t5000CCA37DC5F9B1d0s0  /dev/rdsk/c3t0d0s0
/dev/rdsk/c2t5000CCA369C50609d0s0  /dev/rdsk/c2t5000CCA37DC4FD78d0s0  /dev/rdsk/c2t5000CCA37DC5F9FBd0s0  /dev/rdsk/c3t10d0s0
/dev/rdsk/c2t5000CCA369C50679d0s0  /dev/rdsk/c2t5000CCA37DC52AC1d0s0  /dev/rdsk/c2t5000CCA37DC5FAD2d0s0  /dev/rdsk/c3t11d0s0
/dev/rdsk/c2t5000CCA369C50680d0s0  /dev/rdsk/c2t5000CCA37DC53296d0s0  /dev/rdsk/c2t5000CCA37DC5FB35d0s0  /dev/rdsk/c3t12d0s0
/dev/rdsk/c2t5000CCA369C506AFd0s0  /dev/rdsk/c2t5000CCA37DC532C8d0s0  /dev/rdsk/c2t5000CCA37DC5FC4Ed0s0  /dev/rdsk/c3t13d0s0
/dev/rdsk/c2t5000CCA369C506BBd0s0  /dev/rdsk/c2t5000CCA37DC532CEd0s0  /dev/rdsk/c2t5000CCA37DC5FC58d0s0  /dev/rdsk/c3t14d0s0
/dev/rdsk/c2t5000CCA369C5084Bd0s0  /dev/rdsk/c2t5000CCA37DC53322d0s0  /dev/rdsk/c2t5000CCA37DC5FCA7d0s0  /dev/rdsk/c3t15d0s0
/dev/rdsk/c2t5000CCA369C508C9d0s0  /dev/rdsk/c2t5000CCA37DC53335d0s0  /dev/rdsk/c2t5000CCA37DC5FCB0d0s0  /dev/rdsk/c3t16d0s0
/dev/rdsk/c2t5000CCA369C508E0d0s0  /dev/rdsk/c2t5000CCA37DC5335Cd0s0  /dev/rdsk/c2t5000CCA37DC5FD7Cd0s0  /dev/rdsk/c3t17d0s0
/dev/rdsk/c2t5000CCA369C508E5d0s0  /dev/rdsk/c2t5000CCA37DC5336Ed0s0  /dev/rdsk/c2t5000CCA37DC5FDA4d0s0  /dev/rdsk/c3t18d0s0
/dev/rdsk/c2t5000CCA369C508ECd0s0  /dev/rdsk/c2t5000CCA37DC53370d0s0  /dev/rdsk/c2t5000CCA37DC5FDB8d0s0  /dev/rdsk/c3t19d0s0
/dev/rdsk/c2t5000CCA369C509ECd0s0  /dev/rdsk/c2t5000CCA37DC53385d0s0  /dev/rdsk/c2t5000CCA37DC5FEABd0s0  /dev/rdsk/c3t1d0s0
/dev/rdsk/c2t5000CCA369C50F1Fd0s0  /dev/rdsk/c2t5000CCA37DC53395d0s0  /dev/rdsk/c2t5000CCA37DC5FEB4d0s0  /dev/rdsk/c3t20d0s0
/dev/rdsk/c2t5000CCA369C51558d0s0  /dev/rdsk/c2t5000CCA37DC55121d0s0  /dev/rdsk/c2t5000CCA37DC5FEC1d0s0  /dev/rdsk/c3t21d0s0
/dev/rdsk/c2t5000CCA369C5177Bd0s0  /dev/rdsk/c2t5000CCA37DC55170d0s0  /dev/rdsk/c2t5000CCA37DC5FF06d0s0  /dev/rdsk/c3t22d0s0
/dev/rdsk/c2t5000CCA369C5178Fd0s0  /dev/rdsk/c2t5000CCA37DC56D06d0s0  /dev/rdsk/c2t5000CCA37DC5FF0Ad0s0  /dev/rdsk/c3t23d0s0
/dev/rdsk/c2t5000CCA369C5190Dd0s0  /dev/rdsk/c2t5000CCA37DC56D0Ad0s0  /dev/rdsk/c2t5000CCA37DC5FF35d0s0  /dev/rdsk/c3t24d0s0
/dev/rdsk/c2t5000CCA369C52E05d0s0  /dev/rdsk/c2t5000CCA37DC56D8Ed0s0  /dev/rdsk/c2t5000CCA37DC602FCd0s0  /dev/rdsk/c3t3d0s0
/dev/rdsk/c2t5000CCA369C54C04d0s0  /dev/rdsk/c2t5000CCA37DC56E70d0s0  /dev/rdsk/c2t5000CCA37DC6032Cd0s0  /dev/rdsk/c3t4d0s0
/dev/rdsk/c2t5000CCA369C554CAd0s0  /dev/rdsk/c2t5000CCA37DC56E8Dd0s0  /dev/rdsk/c2t5000CCA37DC60345d0s0  /dev/rdsk/c3t5d0s0
/dev/rdsk/c2t5000CCA369C55766d0s0  /dev/rdsk/c2t5000CCA37DC56F33d0s0  /dev/rdsk/c2t5000CCA37DC60368d0s0  /dev/rdsk/c3t6d0s0
/dev/rdsk/c2t5000CCA369C598A7d0s0  /dev/rdsk/c2t5000CCA37DC56FB5d0s0  /dev/rdsk/c2t5000CCA37DC60AA1d0s0  /dev/rdsk/c3t7d0s0
/dev/rdsk/c2t5000CCA369C59907d0s0  /dev/rdsk/c2t5000CCA37DC56FEFd0s0  /dev/rdsk/c2t5000CCA37DC60AD4d0s0  /dev/rdsk/c3t8d0s0
/dev/rdsk/c2t5000CCA369C59910d0s0  /dev/rdsk/c2t5000CCA37DC57008d0s0  /dev/rdsk/c2t5000CCA37DC60C21d0s0  /dev/rdsk/c3t9d0s0
/dev/rdsk/c2t5000CCA369C59954d0s0  /dev/rdsk/c2t5000CCA37DC57FCAd0s0  /dev/rdsk/c2t5000CCA37DC60CAFd0s0  /dev/rdsk/c5t10d0s0
/dev/rdsk/c2t5000CCA369C599ACd0s0  /dev/rdsk/c2t5000CCA37DC58190d0s0  /dev/rdsk/c2t5000CCA37DC60CB9d0s0  /dev/rdsk/c5t11d0s0
/dev/rdsk/c2t5000CCA369C5A374d0s0  /dev/rdsk/c2t5000CCA37DC581EBd0s0  /dev/rdsk/c2t5000CCA37DC60CCBd0s0  /dev/rdsk/c5t12d0s0
/dev/rdsk/c2t5000CCA369C5A407d0s0  /dev/rdsk/c2t5000CCA37DC58576d0s0  /dev/rdsk/c2t5000CCA37DC60D34d0s0  /dev/rdsk/c5t1d0s0
/dev/rdsk/c2t5000CCA369C5A409d0s0  /dev/rdsk/c2t5000CCA37DC5881Ad0s0  /dev/rdsk/c2t5000CCA37DC60DA0d0s0  /dev/rdsk/c5t2d0s0
/dev/rdsk/c2t5000CCA369C5A416d0s0  /dev/rdsk/c2t5000CCA37DC5889Bd0s0  /dev/rdsk/c2t5000CCA37DC60E7Cd0s0  /dev/rdsk/c5t3d0s0
/dev/rdsk/c2t5000CCA369C5A41Dd0s0  /dev/rdsk/c2t5000CCA37DC5891Dd0s0  /dev/rdsk/c2t5000CCA37DC60F61d0s0  /dev/rdsk/c5t4d0s0
/dev/rdsk/c2t5000CCA369C5A420d0s0  /dev/rdsk/c2t5000CCA37DC58926d0s0  /dev/rdsk/c2t5000CCA37DC60F80d0s0  /dev/rdsk/c5t5d0s0
/dev/rdsk/c2t5000CCA369C5A42Dd0s0  /dev/rdsk/c2t5000CCA37DC58956d0s0  /dev/rdsk/c2t5000CCA37DC60FBFd0s0  /dev/rdsk/c5t6d0s0
/dev/rdsk/c2t5000CCA369C5A432d0s0  /dev/rdsk/c2t5000CCA37DC59312d0s0  /dev/rdsk/c2t5000CCA37DC6172Dd0s0  /dev/rdsk/c5t7d0s0
/dev/rdsk/c2t5000CCA369C5C19Ad0s0  /dev/rdsk/c2t5000CCA37DC59433d0s0  /dev/rdsk/c2t5000CCA37DC619DAd0s0  /dev/rdsk/c5t8d0s0
/dev/rdsk/c2t5000CCA369D347CEd0s0  /dev/rdsk/c2t5000CCA37DC59489d0s0  /dev/rdsk/c2t5000CCA37DC61D6Dd0s0  /dev/rdsk/c5t9d0s0
/dev/rdsk/c2t5000CCA37DC28966d0s0  /dev/rdsk/c2t5000CCA37DC59DAEd0s0  /dev/rdsk/c2t5000CCA37DC61D7Dd0s0

Inquire the device info

root@seal:~# sg_inq /dev/rdsk/c2t5000CCA369C47080d0s0
standard INQUIRY:
  PQual=0  Device_type=0  RMB=0  version=0x06  [SPC-4]
  [AERC=0]  [TrmTsk=0]  NormACA=0  HiSUP=1  Resp_data_format=2
  SCCS=0  ACC=0  TPGS=0  3PC=0  Protect=0  [BQue=0]
  EncServ=0  MultiP=0  [MChngr=0]  [ACKREQQ=0]  Addr16=0
  [RelAdr=0]  WBus16=0  Sync=0  Linked=0  [TranDis=0]  CmdQue=1
  [SPI: Clocking=0x0  QAS=0  IUS=0]
    length=74 (0x4a)   Peripheral device type: disk
 Vendor identification: ATA
 Product identification: Hitachi HDS72302
 Product revision level: A580
 Unit serial number:       MN1220F309SS9D

Identification info

root@seal:~# sg_inq -u /dev/rdsk/c2t5000CCA369C47080d0s0
SCSI_IDENT_LUN_NAA=5000cca369c47080
SCSI_IDENT_PORT_NAA=5003048001155a4d
SCSI_IDENT_PORT_RELATIVE=1

List system expanders

root@seal:~# ls /dev/es/*
/dev/es/ses0  /dev/es/ses10  /dev/es/ses12  /dev/es/ses14  /dev/es/ses2  /dev/es/ses4  /dev/es/ses6  /dev/es/ses8
/dev/es/ses1  /dev/es/ses11  /dev/es/ses13  /dev/es/ses15  /dev/es/ses3  /dev/es/ses5  /dev/es/ses7  /dev/es/ses9

Look for the SES device matching the 1155a port address reported for the drive (its SCSI_IDENT_PORT_NAA above shares the 5003048001155a prefix with the enclosure)

root@seal:~# sg_inq -u /dev/es/ses2
SCSI_IDENT_PORT_NAA=5003048001155a7d
SCSI_IDENT_LUN_NAA=5003048001155a7d

Identify the SES device

root@seal:~# sg_ses /dev/es/ses2
  LSI CORP  SAS2X36           0717
Supported diagnostic pages:
  Supported Diagnostic Pages [sdp] [0x0]
  Configuration (SES) [cf] [0x1]
  Enclosure Status/Control (SES) [ec,es] [0x2]
  Element Descriptor (SES) [ed] [0x7]
  Additional Element Status (SES-2) [aes] [0xa]
  Download Microcode (SES-2) [dm] [0xe]

SES element descriptions

root@seal:~# sg_ses -p ed /dev/es/ses2
  LSI CORP  SAS2X36           0717
  Primary enclosure logical identifier (hex): 5003048001155a7f
Element Descriptor In diagnostic page:
  generation code: 0x0
  element descriptor by type list
    Element type: Array device slot, subenclosure id: 0 [ti=0]
      Overall descriptor: Drive Slots
      Element 0 descriptor: Slot 01
      Element 1 descriptor: Slot 02
      Element 2 descriptor: Slot 03
      Element 3 descriptor: Slot 04
      Element 4 descriptor: Slot 05
      Element 5 descriptor: Slot 06
      Element 6 descriptor: Slot 07
      Element 7 descriptor: Slot 08
      Element 8 descriptor: Slot 09
      Element 9 descriptor: Slot 10
      Element 10 descriptor: Slot 11
      Element 11 descriptor: Slot 12
      Element 12 descriptor: Slot 13
      Element 13 descriptor: Slot 14
      Element 14 descriptor: Slot 15
      Element 15 descriptor: Slot 16
      Element 16 descriptor: Slot 17
      Element 17 descriptor: Slot 18
      Element 18 descriptor: Slot 19
      Element 19 descriptor: Slot 20
      Element 20 descriptor: Slot 21
      Element 21 descriptor: Slot 22
      Element 22 descriptor: Slot 23
      Element 23 descriptor: Slot 24
    Element type: Temperature sensor, subenclosure id: 0 [ti=1]
      Overall descriptor: Temperature Sensors
      Element 0 descriptor: Temperature
    Element type: Cooling, subenclosure id: 0 [ti=2]
      Overall descriptor: Fans
      Element 0 descriptor: Fan1
      Element 1 descriptor: Fan2
      Element 2 descriptor: Fan3
      Element 3 descriptor: JBOD_Fan1
      Element 4 descriptor: JBOD_Fan2
    Element type: Audible alarm, subenclosure id: 0 [ti=3]
      Overall descriptor: Buzzers
      Element 0 descriptor: Buzzer
    Element type: Voltage sensor, subenclosure id: 0 [ti=4]
      Overall descriptor: Voltage Sensors
      Element 0 descriptor: 5V
      Element 1 descriptor: 12V
    Element type: Current sensor, subenclosure id: 0 [ti=5]
      Overall descriptor: Current Sensors
      Element 0 descriptor: 5V Line Current Sensor
      Element 1 descriptor: 12V Line Current Sensor
    Element type: Power supply, subenclosure id: 0 [ti=6]
      Overall descriptor: Power Supplies
      Element 0 descriptor: Power Supply 1
      Element 1 descriptor: Power Supply 2
    Element type: Enclosure, subenclosure id: 0 [ti=7]
      Overall descriptor: Enclosure
      Element 0 descriptor: Enclosure
    Element type: SAS expander, subenclosure id: 0 [ti=8]
      Overall descriptor: SAS Expanders
      Element 0 descriptor: Primary Expander
      Element 1 descriptor: Secondary Expander
    Element type: SAS connector, subenclosure id: 0 [ti=9]
      Overall descriptor: SAS Connectors
      Element 0 descriptor: Upstream Connector (Primary)
      Element 1 descriptor: Downstream Connector 1 (Primary)
      Element 2 descriptor: Downstream Connector 2 (Primary)
      Element 3 descriptor: Upstream Connector (Secondary)
      Element 4 descriptor: Downstream Connector 1 (Secondary)
      Element 5 descriptor: Downstream Connector 2 (Secondary)
      Element 6 descriptor: Drive Connector 00
      Element 7 descriptor: Drive Connector 01
      Element 8 descriptor: Drive Connector 02
      Element 9 descriptor: Drive Connector 03
      Element 10 descriptor: Drive Connector 04
      Element 11 descriptor: Drive Connector 05
      Element 12 descriptor: Drive Connector 06
      Element 13 descriptor: Drive Connector 07
      Element 14 descriptor: Drive Connector 08
      Element 15 descriptor: Drive Connector 09
      Element 16 descriptor: Drive Connector 10
      Element 17 descriptor: Drive Connector 11
      Element 18 descriptor: Drive Connector 12
      Element 19 descriptor: Drive Connector 13
      Element 20 descriptor: Drive Connector 14
      Element 21 descriptor: Drive Connector 15
      Element 22 descriptor: Drive Connector 16
      Element 23 descriptor: Drive Connector 17
      Element 24 descriptor: Drive Connector 18
      Element 25 descriptor: Drive Connector 19
      Element 26 descriptor: Drive Connector 20
      Element 27 descriptor: Drive Connector 21
      Element 28 descriptor: Drive Connector 22
      Element 29 descriptor: Drive Connector 23
    Element type: Communication port, subenclosure id: 0 [ti=10]
      Overall descriptor: Ethernet ports
      Element 0 descriptor: Ethernet_port_1
      Element 1 descriptor: Ethernet_port_2

and look for the matching slot (iterating over the slots)

root@seal:~# sg_ses -I 0,1 -p aes /dev/es/ses2
  LSI CORP  SAS2X36           0717
  Primary enclosure logical identifier (hex): 5003048001155a7f
Additional element status diagnostic page:
  generation code: 0x0
  additional element status descriptor list
      Element index: 1
        Transport protocol: SAS
        number of phys: 1, not all phys: 0, device slot number: 1
        phy index: 0
          device type: no device attached
          initiator port for:
          target port for: SATA_device
          attached SAS address: 0x5003048001155a7f
          SAS address: 0x5003048001155a4d
          phy identifier: 0x0

The drive is therefore in Slot 2 of the SES enclosure /dev/es/ses2 (front backplane, JBOD1).

Switch on the slot's locate LED

root@seal:~# sg_ses -I 0,1 --set=locate /dev/es/ses2

Query the locate LED status

root@seal:~# sg_ses -I 0,1 --get=locate /dev/es/ses2
1

Switch off the locate LED

root@seal:~# sg_ses -I 0,1 --clear=locate /dev/es/ses2

Query the locate LED status again

root@seal:~# sg_ses -I 0,1 --get=locate /dev/es/ses2
0
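
To map every drive in the enclosure, the same additional-element-status query can be iterated over all the slots. A minimal sketch for the 24-slot backplane listed above:

#!/bin/sh
# Print the SAS address and slot number for each of the 24 drive
# slots behind /dev/es/ses2 (element type 0 = Array device slot).
i=0
while [ $i -le 23 ]; do
        echo "=== element 0,$i ==="
        sg_ses -I 0,$i -p aes /dev/es/ses2
        i=$((i + 1))
done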

Removing the ZIL

Currently our ZFS pool is using a ZIL based on a file created on the rpool, which consists of 2 mirrored SAS drives (15k RPM, 3 Gb/s).

With this configuration the pool can serve ~8000 synchronous ops per second. Because this configuration is problematic (it causes trouble when importing the pool), we have removed the ZIL, and the rate has fallen dramatically to no more than ~100 ops per second, so the system is now performing very poorly.
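
For reference, a dedicated log device is managed with zpool itself. A minimal sketch, assuming a hypothetical file /rpool/zil as the log vdev:

# Add a ZIL (slog) backed by a file on rpool (the path is hypothetical)
root@seal:~# zpool add oceano log /rpool/zil

# Remove it again (log devices can be removed since pool version 19)
root@seal:~# zpool remove oceano /rpool/zil

To measure the synchronous write load handled by the ZIL we use the zilstat.ksh DTrace script, whose usage is: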

Usage: zilstat.ksh [gMt][-l linecount] [-p poolname] [interval [count]]
    -M  # print numbers as megabytes (base 10)
    -t  # print timestamp
    -p poolname      # only look at poolname
    -l linecount    # print header every linecount lines (default=only once)
    interval in seconds or "txg" for transaction group commit intervals
             note: "txg" only appropriate when -p poolname is used
    count will limit the number of intervals reported

    examples:
        zilstat.ksh # default output, 1 second samples
        zilstat.ksh 10  # 10 second samples
        zilstat.ksh 10 6    # print 6 x 10 second samples
        zilstat.ksh -p rpool    # show ZIL stats for rpool only

    output:
        [TIME]
        N-Bytes    - data bytes written to ZIL over the interval
        N-Bytes/s  - data bytes/s written to ZIL over the interval
        N-Max-Rate - maximum data rate during any 1-second sample
        B-Bytes    - buffer bytes written to ZIL over the interval
        B-Bytes/s  - buffer bytes/s written to ZIL over the interval
        B-Max-Rate - maximum buffer rate during any 1-second sample
        ops        - number of synchronous iops per interval
        <=4kB      - number of synchronous iops <= 4kBytes per interval
        4-32kB     - number of synchronous iops 4-32kBytes per interval
        >=32kB     - number of synchronous iops >= 32kBytes per interval
    note: data bytes are actual data, total bytes counts buffer size
root@seal.macc.unican.es:~# ./zilstat.ksh -M -p oceano
      N-MB     N-MB/s N-Max-Rate       B-MB     B-MB/s B-Max-Rate    ops  <=4kB 4-32kB >=32kB
         2          2          2          5          5          5     66     23      0     43
        81         81         81        155        155        155   1205     21      0   1184
       177        177        177        355        355        355   2734     20      0   2714
        66         66         66        134        134        134   1048     25      0   1023
        13         13         13         26         26         26    234     33      0    201
         0          0          0          0          0          0      1      1      0      0
         0          0          0          0          0          0      0      0      0      0
         0          0          0          0          0          0      0      0      0      0
         0          0          0          0          0          0      0      0      0      0
        52         52         52         96         96         96    740      2      0    738
        10         10         10         21         21         21    183     20      0    163
         5          5          5         12         12         12    122     30      0     92
         0          0          0          2          2          2     29     10      0     19
       275        275        275        517        517        517   3973     29      0   3944
       266        266        266        500        500        500   3833     16      0   3818
        12         12         12         25         25         25    214     20      0    194
        65         65         65        123        123        123    968     26      0    942
       209        209        209        399        399        399   3064     16      0   3048
        63         63         63        120        120        120    950     29      0    921
        60         60         60        112        112        112    875     21      0    854
         2          2          2          7          7          7     87     34      0     53
         2          2          2          7          7          7     85     30      0     55
         2          2          2          7          7          7     84     30      0     54
        60         60         60        113        113        113    888     25      0    863
         2          2          2          6          6          6     75     28      0     47
         3          3          3          8          8          8     93     32      0     61
        60         60         60        115        115        115    909     31      0    878
         2          2          2          7          7          7     94     35      4     55
        28         28         28         56         56         56    460     31      1    428
        35         35         35         65         65         65    538     36      2    500
         2          2          2          6          6          6     71     26      0     45
         3          3          3          8          8          8     97     34      0     63
        61         61         61        114        114        114    904     27      1    876
        24         24         24         49         49         49    394     20      0    374
         2          2          2          7          7          7     84     30      0     54
         3          3          3          8          8          8     98     35      0     63
         2          2          2          7          7          7     85     30      0     55
         2          2          2          7          7          7     87     32      0     55
       540        540        540       1078       1078       1078   8248     23      0   8223
        60         60         60        114        114        114    891     21      0    870
.............................................................................................
         0          0          0          0          0          0     55     45     10      0
         0          0          0          0          0          0     53     50      1      2
         0          0          0          0          0          0     56     56      0      0
         0          0          0          0          0          0     48     47      0      1
         0          0          0          0          0          0     54     48      0      6
         0          0          0          0          0          0     68     64      0      4
         1          1          1          2          2          2     55     38      0     17
         0          0          0          0          0          0     67     67      0      0
         0          0          0          0          0          0     43     40      2      1
         0          0          0          0          0          0     52     48      2      2
         0          0          0          0          0          0     62     49     13      0
         0          0          0          0          0          0     49     49      0      0
         0          0          0          0          0          0     21     21      0      0
         0          0          0          0          0          0     19     11      0      8
         0          0          0          0          0          0     25     24      0      1
         0          0          0          0          0          0     19     16      3      0
         0          0          0          0          0          0     17     17      0      0
         0          0          0          0          0          0     19     19      0      0
         0          0          0          0          0          0     54     54      0      0
         0          0          0          1          1          1     17      9      0      8
         0          0          0          1          1          1     46     36      0     10
         0          0          0          0          0          0     47     39      2      6
         0          0          0          0          0          0     67     66      0      1
         0          0          0          0          0          0     18     15      3      0
         0          0          0          0          0          0     12     12      0      0
         0          0          0          1          1          1     22     14      0      8
         0          0          0          0          0          0     22     15      0      7
         0          0          0          1          1          1     28     18      2      8
         0          0          0          0          0          0     18     11      0      7
         0          0          0          0          0          0     16      6      3      7
         0          0          0          0          0          0     26     18      2      6
         0          0          0          0          0          0     33     33      0      0

SSD drives for ZIL and L2ARC cache on ZFS

We have ......

  • Posted: 2012-08-01 14:13
  • Author: antonio
  • Categories: (none)

How the Solaris mpt_sas driver names SATA devices

With the SAS2 LSI driver for Solaris (mpt_sas), the device names for SATA drives attached to SAS ports are a bit confusing.

Apparently the main reason for this confusing naming is that the mpt_sas driver supports device multipathing (MPxIO). It therefore needs an identification mechanism resistant to changes in the SAS topology. For a SAS device this is done using the SAS WWN, but for SATA devices it is done using the GUID of the SATA device.

From the source code of OpenSolaris's mpt_sas driver I have discovered that the driver inquires something called 'page 83':

uint64_t mptsas_get_sata_guid(mptsas_t *mpt, mptsas_target_t *ptgt, int lun)
{
        uint64_t        sata_guid = 0, *pwwn = NULL;
        int             target = ptgt->m_devhdl;
        uchar_t         *inq83 = NULL;
        int             inq83_len = 0xFF;
        uchar_t         *dblk = NULL;
        int             inq83_retry = 3;
        int             rval = DDI_FAILURE;

        inq83   = kmem_zalloc(inq83_len, KM_SLEEP);

inq83_retry:
        rval = mptsas_inquiry(mpt, ptgt, lun, 0x83, inq83,
            inq83_len, NULL, 1);
        if (rval != DDI_SUCCESS) {
                mptsas_log(mpt, CE_WARN, "!mptsas request inquiry page "
                    "0x83 for target:%x, lun:%x failed!", target, lun);
                goto out;
        }
        /* According to SAT2, the first descriptor is logic unit name */
        dblk = &inq83[4];
        if ((dblk[1] & 0x30) != 0) {
                mptsas_log(mpt, CE_WARN, "!Descriptor is not lun associated.");
                goto out;
        }
        pwwn = (uint64_t *)(void *)(&dblk[4]);
        if ((dblk[4] & 0xf0) == 0x50) {
                sata_guid = BE_64(*pwwn);
                goto out;
        } else if (dblk[4] == 'A') {
                NDBG20(("SATA drive has no NAA format GUID."));
                goto out;
        } else {
                /* The data is not ready, wait and retry */
                inq83_retry--;
                if (inq83_retry <= 0) {
                        goto out;
                }
                NDBG20(("The GUID is not ready, retry..."));
                delay(1 * drv_usectohz(1000000));
                goto inq83_retry;
        }
out:
        kmem_free(inq83, inq83_len);
        return (sata_guid);
}

As an exercise I will try to discover this information for the following device:

root@seal.macc.unican.es:~# prtconf -v /dev/dsk/c10t5000CCA221C25B1Ed0
disk, instance #217
    Driver properties:
        name='inquiry-serial-no' type=string items=1 dev=none
            value='JK1133YAG55NLU'
        name='pm-components' type=string items=3 dev=none
            value='NAME=spindle-motor' + '0=off' + '1=on'
        name='pm-hardware-state' type=string items=1 dev=none
            value='needs-suspend-resume'
        name='ddi-failfast-supported' type=boolean dev=none
        name='ddi-kernel-ioctl' type=boolean dev=none
        name='fm-ereport-capable' type=boolean dev=none
        name='device-nblocks' type=int64 items=1 dev=none
            value=00000000e8e088b0
        name='device-blksize' type=int items=1 dev=none
            value=00000200
    Hardware properties:
        name='devid' type=string items=1
            value='id1,sd@n5000cca221c25b1e'
        name='inquiry-device-type' type=int items=1
            value=00000000
        name='inquiry-revision-id' type=string items=1
            value='JKAOA20N'
        name='inquiry-product-id' type=string items=1
            value='HDS722020ALA330'
        name='inquiry-vendor-id' type=string items=1
            value='Hitachi'
        name='class' type=string items=1
            value='scsi'
        name='obp-path' type=string items=1
            value='/pci@7a,0/pci8086,340e@7/pci1000,3080@0/disk@w5000cca221c25b1e,0'
        name='pm-capable' type=int items=1
            value=00000001
        name='guid' type=string items=1
            value='5000cca221c25b1e'
        name='sas-mpt' type=boolean
        name='port-wwn' type=byte items=8
            value=50.00.cc.a2.21.c2.5b.1e
        name='target-port' type=string items=1
            value='5000cca221c25b1e'
        name='compatible' type=string items=4
            value='scsiclass,00.vATA.pHitachi_HDS72202.rA20N' + 'scsiclass,00.vATA.pHitachi_HDS72202' + 'scsiclass,00' + 'scsiclass'
        name='lun' type=int items=1
            value=00000000

We can inquire the '83h' page (section 10.3.4 of the attached document) using the LSIUTIL tool (version 1.63):

root@seal.macc.unican.es:~/LSIUTIL# ./lsiutil

LSI Logic MPT Configuration Utility, Version 1.63, June 4, 2009

5 MPT Ports found

     Port Name         Chip Vendor/Type/Rev    MPT Rev  Firmware Rev  IOC
 1.  mpt1              LSI Logic SAS1068E B3     105      011e0000     0
 2.  mpt2              LSI Logic SAS1068E B3     105      011e0000     0
 3.  mpt_sas0          LSI Logic SAS2008 03      200      0b000000     0
 4.  mpt_sas1          LSI Logic SAS2008 03      200      0b000000     0
 5.  mpt_sas6          LSI Logic SAS2008 02      200      0a000200     0

Select a device:  [1-5 or 0 to quit] 3

 1.  Identify firmware, BIOS, and/or FCode
 2.  Download firmware (update the FLASH)
 4.  Download/erase BIOS and/or FCode (update the FLASH)
 8.  Scan for devices
10.  Change IOC settings (interrupt coalescing)
13.  Change SAS IO Unit settings
16.  Display attached devices
20.  Diagnostics
21.  RAID actions
23.  Reset target
42.  Display operating system names for devices
43.  Diagnostic Buffer actions
45.  Concatenate SAS firmware and NVDATA files
59.  Dump PCI config space
60.  Show non-default settings
61.  Restore default settings
66.  Show SAS discovery errors
69.  Show board manufacturing information
97.  Reset SAS link, HARD RESET
98.  Reset SAS link
99.  Reset port
 e   Enable expert mode in menus
 p   Enable paged mode
 w   Enable logging

Main menu, select an option:  [1-99 or e/p/w or 0 to quit] e

Main menu, select an option:  [1-99 or e/p/w or 0 to quit] 20

 1.  Inquiry Test
 2.  WriteBuffer/ReadBuffer/Compare Test
 3.  Read Test
 4.  Write/Read/Compare Test
 5.  Write Test
 6.  Read/Compare Test
 7.  Log Sense Test
 8.  Read Capacity / Read Block Limits Test
 9.  Mode Page Test
10.  SATA Identify Device Test
11.  SATA Clear Affiliation Test
12.  Display phy counters
13.  Clear phy counters
14.  SATA SMART Read Test
15.  SEP (SCSI Enclosure Processor) Test
16.  Issue product-specific SAS IO Unit Control
17.  Diag data upload
18.  Report LUNs Test
19.  Drive firmware download
20.  Expander firmware download
21.  Read Logical Blocks
22.  Write Logical Blocks
23.  Verify Logical Blocks
24.  Read Buffer (for firmware upload)
25.  Display Expander Log entries
26.  Clear (erase) Expander Log entries
29.  Diagnostic Page Test
30.  Inject media error
31.  Repair media error
32.  Set software write protect
33.  Clear software write protect
34.  Enable read cache
35.  Disable read cache
36.  Enable write cache
37.  Disable write cache
98.  Reset expander
99.  Reset port
 e   Disable expert mode in menus
 p   Enable paged mode
 w   Enable logging

Diagnostics menu, select an option:  [1-99 or e/p/w or 0 to quit] 1

Bus:  [0-2 or RETURN to quit] 0
Target:  [0-255 or RETURN to quit] 31
LUN:  [0-255 or RETURN to quit] 0
VPD Page:  [00-FF or RETURN for normal Inquiry] 83

 B___T___L  Page
 0  31   0   83

36 bytes of Inquiry Data returned

0000 : 00 83 00 20 01 03 00 08 50 00 cc a2 21 c2 5b 1e            P   ! [
0010 : 61 93 00 08 50 03 04 80 01 15 5a 61 01 14 00 04    a   P     Za
0020 : 00 00 00 00

Bytes 9-16 correspond to the device name '/dev/dsk/c10t5000CCA221C25B1Ed0' given by the mpt_sas driver. We can check that the serial number corresponds to this drive by inquiring page '80h' (section 10.3.3 of the attached document):

Bus:  [0-2 or RETURN to quit] 0
Target:  [0-255 or RETURN to quit] 31
LUN:  [0-255 or RETURN to quit] 0
VPD Page:  [00-FF or RETURN for normal Inquiry] 80

 B___T___L  Page
 0  31   0   80

24 bytes of Inquiry Data returned

0000 : 00 80 00 14 20 20 20 20 20 20 4a 4b 31 31 33 33              JK1133
0010 : 59 41 47 35 35 4e 4c 55                            YAG55NLU
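
The same VPD pages can also be read through the Solaris disk node with the sg3_utils used earlier for the enclosures; a sketch, assuming sg_vpd is installed and the usual s0 slice path:

# Device Identification VPD page (83h)
root@seal.macc.unican.es:~# sg_vpd -p di /dev/rdsk/c10t5000CCA221C25B1Ed0s0

# Unit Serial Number VPD page (80h)
root@seal.macc.unican.es:~# sg_vpd -p sn /dev/rdsk/c10t5000CCA221C25B1Ed0s0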

  • Posted: 2012-07-29 19:04 (Updated: 2012-07-29 19:15)
  • Author: antonio
  • Categories: (none)

Managing the nodes using IPMItool

To manage the nodes from the command line, it is enough to connect to NAT and run the following command from there to check the status of a node (wn014):

[root@nat ~]# ipmitool -H 192.168.200.24 -U ADMIN power status
Password:
Chassis Power is on

and to reset it we use the following command:

[root@nat ~]# ipmitool -H 192.168.200.24 -U ADMIN chassis power reset
Password:
Chassis Power Control: Reset
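
To check several nodes at once, the same command can be looped over the BMC addresses. A minimal sketch, assuming the other worker nodes follow the 192.168.200.x pattern of wn014 and that the password is exported in IPMI_PASS:

#!/bin/sh
# Query the power status of a set of node BMCs (the address range is
# an assumption based on wn014 = 192.168.200.24).
for i in 21 22 23 24 25 26; do
        printf '192.168.200.%s: ' "$i"
        ipmitool -H 192.168.200.$i -U ADMIN -P "$IPMI_PASS" power status
done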

Checking for rpciod saturation from SEAL

Sometimes you may have noticed that deleting directories directly on seal is slower than doing it over NFS.

This happens when seal is being "attacked" over NFS, causing a DoS: since NFS runs as a kernel module, its operations have higher priority than operations started by the user.

After much reading and searching for the best way to monitor what is going on, I have been able to diagnose the problem, or at least I think I have.

For this I have used DTrace, the monitoring tool that Solaris ships natively (it is also available on Mac OS X and FreeBSD), and the truth is that it is a very powerful tool without equal on Linux. There is a book with lots of examples attached to this ticket.

The NFS DoS seems to originate from the rpciod threads related to NFS, and it appears to be a bug in the NFS implementation of the Linux clients (reference needed). When seal shuts down, the clients wait for the server to come back, but sometimes the clients hang and the rpciod threads start consuming CPU. It seems the processes get stuck in a loop in which they issue requests to the NFS server, consuming bandwidth and server resources without actually performing any read/write operation.

To detect whether any client has a runaway rpciod, we can use DTrace from seal. To do so, we count the NFSv4 operations being performed:

root@seal.macc.unican.es:~# time dtrace -n 'nfsv4::: { @[probename] = count(); }'
dtrace: description 'nfsv4::: ' matched 81 probes
^C

  op-read-done                                                      1
  op-read-start                                                     1
  op-setattr-done                                                   3
  op-setattr-start                                                  3
  op-access-done                                                    7
  op-access-start                                                   7
  op-commit-done                                                   13
  op-commit-start                                                  13
  op-close-done                                                    19
  op-close-start                                                   19
  op-open-done                                                     31
  op-open-start                                                    31
  op-restorefh-done                                                31
  op-restorefh-start                                               31
  op-savefh-done                                                   31
  op-savefh-start                                                  31
  op-lookup-done                                                 1018
  op-lookup-start                                                1018
  op-getfh-done                                                  1049
  op-getfh-start                                                 1049
  op-getattr-start                                               2367
  op-getattr-done                                                2369
  op-renew-done                                                  3233
  op-renew-start                                                 3233
  op-write-done                                                403884
  op-write-start                                               403884
  op-putfh-done                                                406206
  op-putfh-start                                               406207
  compound-start                                               409440
  compound-done                                                409441

real    0m15.303s
user    0m0.421s
sys     0m0.394s

The dtrace command is preceded by time because it only returns the statistics once we press CTRL+C; with time we know how long dtrace has been collecting data.
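
Alternatively, a tick probe makes dtrace exit on its own after a fixed interval, with no CTRL+C needed; a minimal sketch:

root@seal.macc.unican.es:~# dtrace -n 'nfsv4::: { @[probename] = count(); } tick-15s { exit(0); }'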

In this case, in the 15 seconds the capture lasted there were more than 400k ops against seal's NFS server, which is excessive.

The next step is to find out which NFS client is generating all those operations; for that we use another dtrace command:

root@seal.macc.unican.es:~# time dtrace -n 'nfsv4:::compound-start { @[args[0]->ci_remote] = count(); }'
dtrace: description 'nfsv4:::compound-start ' matched 1 probe
^C

  192.168.202.43                                                    1
  192.168.202.44                                                   33
  192.168.202.131                                                  48
  192.168.202.45                                                   71
  193.144.184.29                                                 1914
  192.168.202.133                                              332377

real    0m13.027s
user    0m0.432s
sys     0m0.372s

It is clear that client 192.168.202.133 (ce01) is performing far too many operations (mostly against home_grid).

If we check the CPU usage on CE01, we see that rpciod is consuming a large amount of CPU:

[antonio@ce01 ~]$ top -b
top - 21:09:10 up 1 day, 20:12,  1 user,  load average: 0.60, 0.46, 0.84
Tasks: 250 total,   2 running, 248 sleeping,   0 stopped,   0 zombie
Cpu(s):  2.3% us,  1.2% sy,  0.0% ni, 95.6% id,  0.7% wa,  0.0% hi,  0.1% si
Mem:   1536000k total,  1314904k used,   221096k free,   138524k buffers
Swap:  7335664k total,        0k used,  7335664k free,   544060k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
 4798 root      10  -5     0    0    0 S 11.3  0.0   1:22.41 rpciod/0
    1 root      15   0  1644  544  468 S  0.0  0.0   0:00.64 init
    2 root      RT  -5     0    0    0 S  0.0  0.0   0:00.00 migration/0
    3 root      34  19     0    0    0 S  0.0  0.0   0:03.40 ksoftirqd/0
    4 root      RT  -5     0    0    0 S  0.0  0.0   0:00.00 watchdog/0
    5 root      10  -5     0    0    0 S  0.0  0.0   0:00.01 events/0
    6 root      14  -5     0    0    0 S  0.0  0.0   0:00.00 khelper
    7 root      20  -5     0    0    0 S  0.0  0.0   0:11.07 kthread
    9 root      10  -5     0    0    0 S  0.0  0.0   0:00.00 xenwatch
   10 root      10  -5     0    0    0 S  0.0  0.0   0:00.00 xenbus
   17 root      10  -5     0    0    0 S  0.0  0.0   0:00.00 kblockd/0
   18 root      20  -5     0    0    0 S  0.0  0.0   0:00.00 cqueue/0
   22 root      20  -5     0    0    0 S  0.0  0.0   0:00.00 khubd
   24 root      10  -5     0    0    0 S  0.0  0.0   0:00.00 kseriod
   82 root      25   0     0    0    0 S  0.0  0.0   0:00.00 pdflush
   83 root      15   0     0    0    0 S  0.0  0.0   0:01.43 pdflush
   84 root      20  -5     0    0    0 S  0.0  0.0   0:00.00 kswapd0
   85 root      20  -5     0    0    0 S  0.0  0.0   0:00.00 aio/0
  215 root      11  -5     0    0    0 S  0.0  0.0   0:00.00 kpsmoused

So CE01's NFS client has gotten stuck, and due to the 'idiosyncrasies' of NFS it must be rebooted so that it does not choke SEAL's NFS server. As usual, NFS blocks the reboot, making it necessary to bring the machine down with a hard reset.
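
When the reboot hangs on NFS like this, one way to force the reset from the client itself is the magic SysRq trigger; a sketch, assuming SysRq is enabled on the node:

[root@ce01 ~]# echo 1 > /proc/sys/kernel/sysrq   # enable SysRq if it is not
[root@ce01 ~]# echo b > /proc/sysrq-trigger      # immediate reboot, no sync or unmount

Otherwise, the IPMI chassis reset shown in the previous post does the same job from outside.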

We check what has happened to the ops on the NFS server:

root@seal.macc.unican.es:~# time dtrace -n 'nfsv4::: { @[probename] = count(); }'
dtrace: description 'nfsv4::: ' matched 81 probes
^C

  op-open-downgrade-done                                            1
  op-open-downgrade-start                                           1
  op-setclientid-confirm-done                                       1
  op-setclientid-confirm-start                                      1
  op-setclientid-done                                               1
  op-setclientid-start                                              1
  null-done                                                         2
  null-start                                                        2
  op-putrootfh-done                                                 2
  op-putrootfh-start                                                2
  op-renew-done                                                     3
  op-renew-start                                                    3
  op-open-confirm-done                                             12
  op-open-confirm-start                                            12
  op-rename-done                                                   20
  op-rename-start                                                  20
  op-commit-done                                                   30
  op-commit-start                                                  30
  op-create-done                                                   32
  op-create-start                                                  32
  op-readdir-done                                                  40
  op-readdir-start                                                 40
  op-link-done                                                     58
  op-link-start                                                    58
  op-remove-done                                                   86
  op-remove-start                                                  86
  op-setattr-done                                                  90
  op-setattr-start                                                 90
  op-write-done                                                    93
  op-write-start                                                   93
  op-access-done                                                  184
  op-access-start                                                 184
  op-open-done                                                    284
  op-open-start                                                   284
  op-close-done                                                   303
  op-close-start                                                  303
  op-restorefh-done                                               384
  op-restorefh-start                                              384
  op-savefh-done                                                  394
  op-savefh-start                                                 394
  op-lookup-done                                                 1833
  op-lookup-start                                                1833
  op-getfh-done                                                  2008
  op-getfh-start                                                 2008
  op-getattr-done                                                8778
  op-getattr-start                                               8779
  op-read-done                                                  14310
  op-read-start                                                 14310
  compound-done                                                 23015
  compound-start                                                23016
  op-putfh-done                                                 23088
  op-putfh-start                                                23088

real    0m22.732s
user    0m0.492s
sys     0m0.345s

I think they are still high, so let's check which client is originating them:

root@seal.macc.unican.es:~# time dtrace -n 'nfsv4:::compound-start { @[args[0]->ci_remote] = count(); }'
dtrace: description 'nfsv4:::compound-start ' matched 1 probe
^C

  192.168.202.43                                                   71
  192.168.202.44                                                   72
  192.168.202.131                                                  76
  192.168.202.133                                                 109
  192.168.202.15                                                 2517
  193.144.184.29                                                 9280

real    0m24.262s
user    0m0.420s
sys     0m0.371s

It seems to be oceano, so let's see what is going on:

[antonio@oceano ~]$ top
top - 21:41:24 up 10 days,  3:25,  4 users,  load average: 9.82, 6.91, 4.41
Tasks: 306 total,   1 running, 305 sleeping,   0 stopped,   0 zombie
Cpu(s):  0.8%us,  1.0%sy,  0.0%ni, 85.0%id, 12.1%wa,  0.1%hi,  1.1%si,  0.0%st
Mem:  24557908k total, 24434096k used,   123812k free,    75000k buffers
Swap: 37552112k total,      160k used, 37551952k free, 18610848k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
21069 daniel    15   0 61292 3108 2172 S  5.0  0.0   0:51.87 sftp-server
 4018 antonio   22   0 6652m 394m  10m S  3.0  1.6  24:01.74 java
22398 daniel    15   0 61280 3072 2176 S  1.3  0.0   0:10.23 sftp-server
 3406 root      10  -5     0    0    0 S  0.7  0.0   9:00.22 rpciod/7
21019 daniel    15   0 61288 3108 2172 S  0.7  0.0   0:41.71 sftp-server
 4071 root      10  -5     0    0    0 S  0.3  0.0   2:30.25 nfsiod
21016 daniel    15   0 62052 3076 2168 D  0.3  0.0   0:26.26 sftp-server
21111 daniel    15   0 61284 3100 2168 D  0.3  0.0   0:23.25 sftp-server
21144 daniel    15   0 61284 3100 2168 D  0.3  0.0   0:23.47 sftp-server
21230 daniel    15   0 61288 3104 2168 S  0.3  0.0   0:23.18 sftp-server
21297 daniel    15   0 61284 3100 2168 D  0.3  0.0   0:23.04 sftp-server
21363 daniel    16   0 61280 3096 2168 D  0.3  0.0   0:23.26 sftp-server
21450 daniel    15   0 61280 3096 2168 D  0.3  0.0   0:23.15 sftp-server
21457 daniel    15   0 61288 3112 2172 D  0.3  0.0   0:22.32 sftp-server
21521 daniel    16   0 61292 3112 2172 D  0.3  0.0   0:22.61 sftp-server
21559 daniel    15   0 62052 3076 2168 D  0.3  0.0   0:23.47 sftp-server
22504 antonio   15   0 30996 2352 1552 R  0.3  0.0   0:00.07 top

It seems it was all the copy processes, which had hung. We will wait and see what happens, check that they finish without problems, and make sure that rpciod does not get stuck again.
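
To keep an eye on rpciod on the client in the meantime, batch-mode top filtered on the thread name is enough; a minimal sketch:

[antonio@oceano ~]$ top -b -d 10 | grep rpciod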

[to be continued ...]