【Ceph】Ceph Error Notes

unable to open OSD superblock on /var/lib/ceph/osd/ceph-45

key: use_rdma, val: 1
default pool attr 0, nr_hugepages 0, no neet set hugepages
2021-03-28 21:46:07.548765 7f904d555ec0 2818102 20 ERROR bluestore(/var/lib/ceph/osd/ceph-45/block) _read_bdev_label failed to open /var/lib/ceph/osd/ceph-45/block: (2) No such file or directory
2021-03-28 21:46:07.548788 7f904d555ec0 2818102 20 ERROR  ** ERROR: unable to open OSD superblock on /var/lib/ceph/osd/ceph-45: (2) No such file or directory

Cause: the OSD id is wrong. `ceph osd tree` shows the cluster only contains osd.0 through osd.29, so osd.45 does not exist (a quick verification sketch follows the tree output).

[root@rdma61 osd]# ceph osd tree
ID  CLASS WEIGHT    TYPE NAME                   STATUS REWEIGHT PRI-AFF 
-10               0 root maintain                                       
 -9       123.71407 root hddpool                                        
-11       123.71407     rack rack.hddpool                               
-15        36.38649         host rdma61.hddpool                         
  3   hdd   7.27730             osd.3             down  1.00000 1.00000 
  6   hdd   7.27730             osd.6             down  1.00000 1.00000 
 10   hdd   7.27730             osd.10            down  1.00000 1.00000 
 13   hdd   7.27730             osd.13            down  1.00000 1.00000 
 16   hdd   7.27730             osd.16            down  1.00000 1.00000 
 -6        43.66379         host rdma63.hddpool                         
  1   hdd   7.27730             osd.1             down  1.00000 1.00000 
  4   hdd   7.27730             osd.4             down  1.00000 1.00000 
  7   hdd   7.27730             osd.7             down  1.00000 1.00000 
  9   hdd   7.27730             osd.9             down  1.00000 1.00000 
 12   hdd   7.27730             osd.12            down  1.00000 1.00000 
 15   hdd   7.27730             osd.15            down  1.00000 1.00000 
 -3        43.66379         host rdma64.hddpool                         
  2   hdd   7.27730             osd.2             down  1.00000 1.00000 
  5   hdd   7.27730             osd.5             down  1.00000 1.00000 
  8   hdd   7.27730             osd.8             down  1.00000 1.00000 
 11   hdd   7.27730             osd.11            down  1.00000 1.00000 
 14   hdd   7.27730             osd.14            down  1.00000 1.00000 
 17   hdd   7.27730             osd.17            down  1.00000 1.00000 
 -5        10.47839 root ssdpool                                        
-25        10.47839     rack rack.ssdpool                               
-28         3.49280         host rdma61.ssdpool                         
 18   ssd   0.87320             osd.18            down  1.00000 1.00000 
 21   ssd   0.87320             osd.21            down  1.00000 1.00000 
 24   ssd   0.87320             osd.24            down  1.00000 1.00000 
 28   ssd   0.87320             osd.28            down  1.00000 1.00000 
-31         3.49280         host rdma63.ssdpool                         
 20   ssd   0.87320             osd.20            down  1.00000 1.00000 
 22   ssd   0.87320             osd.22            down  1.00000 1.00000 
 25   ssd   0.87320             osd.25            down  1.00000 1.00000 
 27   ssd   0.87320             osd.27            down  1.00000 1.00000 
-34         3.49280         host rdma64.ssdpool                         
 19   ssd   0.87320             osd.19            down  1.00000 1.00000 
 23   ssd   0.87320             osd.23            down  1.00000 1.00000 
 26   ssd   0.87320             osd.26            down  1.00000 1.00000 
 29   ssd   0.87320             osd.29            down        0 1.00000 
 -1               0 root default  
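
A minimal way to confirm the mismatch on the node itself (a sketch; the OSD ids and paths below are taken from this example cluster and will differ on other hosts):

# List the OSD data directories actually present on this node;
# on rdma61 you would expect ceph-3, ceph-6, ceph-10, ... but no ceph-45
ls /var/lib/ceph/osd/

# For an OSD that really exists, the block symlink should resolve to a real device
readlink -f /var/lib/ceph/osd/ceph-3/block

# Start or restart only the OSD ids that exist on this host (osd.3 used as an example)
systemctl restart ceph-osd@3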

fio reports an error

[root@localhost ~]# ./fio --ioengine=rbd --iodepth=4 --numjobs=8 --pool=.rbdpool.rbd --rbdname=lun0 --name=write5 --rw=randwrite --bs=1M --size=10G --group_reporting --direct=1
write5: (g=0): rw=randwrite, bs=1M-1M/1M-1M/1M-1M, ioengine=rbd, iodepth=4
...
fio-2.1.10
Starting 8 processes
rbd engine: RBD version: 1.12.0
rados_ioctx_create failed.
fio_rbd_connect failed.
rbd engine: RBD version: 1.12.0
rados_ioctx_create failed.
fio_rbd_connect failed.
rbd engine: RBD version: 1.12.0
rados_ioctx_create failed.
fio_rbd_connect failed.
rbd engine: RBD version: 1.12.0
rados_ioctx_create failed.
fio_rbd_connect failed.
rbd engine: RBD version: 1.12.0
rados_ioctx_create failed.
fio_rbd_connect failed.
rbd engine: RBD version: 1.12.0
rados_ioctx_create failed.
fio_rbd_connect failed.
rbd engine: RBD version: 1.12.0
rados_ioctx_create failed.
fio_rbd_connect failed.
rbd engine: RBD version: 1.12.0
rados_ioctx_create failed.
fio_rbd_connect failed.

Cause: the pool name is wrong. The pool passed to fio via --pool does not exist in the cluster, so rados_ioctx_create (and therefore fio_rbd_connect) fails.
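
To confirm, list the pools and rerun fio against one that actually exists (a sketch; rbdpool.rbd is an assumed correct pool name, and the image lun0 must already exist in that pool):

# List the pools known to the cluster
ceph osd lspools

# Verify the image exists in the pool you intend to test
rbd ls -p rbdpool.rbd

# Rerun fio with the corrected --pool value (pool/image names are assumptions)
./fio --ioengine=rbd --iodepth=4 --numjobs=8 --pool=rbdpool.rbd --rbdname=lun0 \
    --name=write5 --rw=randwrite --bs=1M --size=10G --group_reporting --direct=1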
 

Reposted from blog.csdn.net/bandaoyu/article/details/115288726