Subject: Re: df inside guest shows incorrect disk quota
From: Steve Kieu <msh.computing@gmail.com>
Date: Mon, 20 Feb 2012 09:40:58 +1100
Hi again,

I found it; it is because the LV for all vservers is full :-(
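
For the record, this is roughly how it shows up on the host; the paths below are just examples, assuming all the guests live on a single LVM volume mounted at /vservers:

 df -h /vservers   # the shared filesystem backing every guest -- this is where the space actually ran out
 lvs               # lists the logical volumes and their sizes, to see which LV /vservers sits on

If I understand the setup right, /dev/hdv1 inside a guest is essentially a view of that host filesystem, so df in the guest reports the space actually left on the LV, which is why the apps only saw 1.7G.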

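If anyone else hits the same symptom, the obvious way out, assuming the volume group still has free extents and the filesystem is ext3/ext4, would be something like this (the LV path /dev/vg0/vservers is only a placeholder):

 lvextend -L +20G /dev/vg0/vservers   # grow the logical volume
 resize2fs /dev/vg0/vservers          # grow the filesystem to match

or alternatively just free up some space on the LV.
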
thanks


On Mon, Feb 20, 2012 at 9:24 AM, Steve Kieu <msh.computing@gmail.com> wrote:

> Hello,
>
> I am having a strange problem. On the host (kernel/vserver version
> 2.6.37.6-vs2.3.0.37-rc5-h1cpu16, util-vserver 0.30.216_pre3004):
>
> vdu --xid test-splunk1 --space test-splunk1
> test-splunk1 1493848
>
> vdu --xid test-splunk1 --inodes test-splunk1
> test-splunk1 28626
>
> But inside the guest, running df shows:
>
>  df -h
> Filesystem            Size  Used Avail Use% Mounted on
> /dev/hdv1              20G  9.9G  1.7G  86% /
> none                  7.9G     0  7.9G   0% /dev/shm
>
> Inodes are reported correctly, though:
> df -i
> Filesystem            Inodes   IUsed   IFree IUse% Mounted on
> /dev/hdv1             341333   28626  312707    9% /
> none                 2054249       1 2054248    1% /dev/shm
>
>
> Why did that happen, and what can I do to fix it? The apps inside think
> there is only 1.7G available and stop working. Please help... I am
> running out of ideas at the moment.
>
> --
> Steve Kieu
>



-- 
Steve Kieu

