Absent Member.

Root Full

Yesterday, while working on some training documents, I noticed my Filr system continuously going down. I could get into the admin portion to reboot and shut down, but going to the normal login wouldn't work. After a while I realized the root partition was at 99%.
I removed about two weeks' worth of log files, but I'm concerned that it now sits at 92%. I removed the famtd, catalina, localhost access, messages, and ssf logs. These are all the archived/old logs.

Is there somewhere else I can look? I'm concerned about the lack of space with everything sitting in root.
Knowledge Partner

Re: Root Full

abens;2263718 wrote:
Yesterday, while working on some training documents, I noticed my Filr system continuously going down. I could get into the admin portion to reboot and shut down, but going to the normal login wouldn't work. After a while I realized the root partition was at 99%.
I removed about two weeks' worth of log files, but I'm concerned that it now sits at 92%. I removed the famtd, catalina, localhost access, messages, and ssf logs. These are all the archived/old logs.

Is there somewhere else I can look? I'm concerned about the lack of space with everything sitting in root.


Is this a small-deployment setup (so everything is on one appliance)?
Did you add a second disk in the VMware configuration before starting the appliance for the first time?

Are you using internal storage only (meaning Filr-only users), or LDAP-imported users with Home Directories/Net Folders defined?

You can use regular Linux commands to find out which directories are taking up a lot of space:

cd /
du -sxh *

That will give you the top-level directories and their combined sizes. Then you have to drill down, running
du -sxh *
in each large directory until you find the directory or directories with lots of stuff in them.
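A variant of the same idea that saves some of the drilling (just a sketch; du reports sizes in plain kilobytes here so the output sorts cleanly):

# Size (in KB) of each top-level directory on the root filesystem.
# -x keeps du on one filesystem, so /vastorage and other mounts are skipped;
# sort -n puts the biggest directories at the bottom.
du -x --max-depth=1 / 2>/dev/null | sort -n

# Then repeat on the largest entry, for example:
du -x --max-depth=1 /var 2>/dev/null | sort -n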
Absent Member.

Re: Root Full

Small deployment, second disk in place, LDAP users. Apparently there is a deeper issue, because Filr-Tomcat keeps shutting down, I'm out of disk space again, and the Java heap dump is trying to write, so my system is just kind of sitting there. I've been online for almost 4 hours waiting for tech support to answer a chat session.

I did the du -h; I'll try the other commands to see if I can see what's eating up disk space.
Absent Member.

Re: Root Full

On Wed, 15 May 2013 19:26:02 GMT, abens
<abens@no-mx.forums.novell.com> wrote:

Hi,

Most likely you will have crash dump files from Java; they eat up the disk big time.


>
> Small deployment, second disk in place, LDAP users. Apparently there is
> a deeper issue, because Filr-Tomcat keeps shutting down, I'm out of disk
> space again, and the Java heap dump is trying to write, so my system is
> just kind of sitting there. I've been online for almost 4 hours waiting
> for tech support to answer a chat session.
>
> I did the du -h; I'll try the other commands to see if I can see what's
> eating up disk space.


Absent Member.

Re: Root Full

Alex, you nailed part of it. After waiting 4.5 hours for an engineer to help figure this out and running du -h --max-depth=1 we found the ./tmp folder was 33 gb. In there we found 3 core dumps, 3 java dumps and 4 javaheap dumps from the day prior. I was also getting some strange memory leak errors in the famtd.log. My guess was that since the disk was full, and it couldn't write anything else it thought there was a memory leak. We have a monitor on this to see if it core dumps again.
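If anyone else needs to check for the same thing, something like this will list the big dump files before you decide what to delete (just a sketch; the file-name patterns are guesses based on what showed up on our appliance and may differ on yours):

# Find core dumps, JVM error logs and heap dumps over 100 MB under /tmp
find /tmp \( -name 'core*' -o -name 'hs_err_pid*' -o -name '*.hprof' -o -name 'heapdump*' \) -size +100M -exec ls -lh {} \;

# Only once you are sure support does not need them, remove them, e.g.:
# rm /tmp/*.hprof /tmp/heapdump*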
Absent Member.

Re: Root Full

I had a similar issue, though with log files rather than dumps. Some 30 GB worth, bringing my root to 91%, and this was shutting down Filr every hour when the hourly cron job checked the available storage.

Until this is resolved (this being an appliance and all), I created a logs directory on /vastorage [I have a 1G additional partition], deleted the /var/opt/novell/tomcat-filr/logs directory, and created a symbolic link so that my log files go to my additional storage and leave root alone. In my case it was:
cd /var/opt/novell/tomcat-filr
ln -s /vastorage/logs

I changed the owner of /vastorage/logs to wwwrun:www.
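Put together, the whole sequence looked roughly like this (a sketch only; I'm assuming the Filr/Tomcat service is stopped before you touch the logs directory, and the paths are as on my appliance):

# Stop Filr/Tomcat first (the exact service command varies by appliance version), then:
mkdir /vastorage/logs                    # new home for the logs on the extra partition
cd /var/opt/novell/tomcat-filr
rm -rf logs                              # I deleted the old logs; move them instead if you want to keep them
ln -s /vastorage/logs logs               # tomcat-filr/logs now points at /vastorage/logs
chown wwwrun:www /vastorage/logs         # same owner the original logs directory used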

I'm now running with 33% available space, and Filr should stay up the next time the hourly cron job runs.
Hope this helps someone.
Regards,
Eric.
Absent Member.

Re: Root Full

Eric
Which log files? I found TID 7012442 for the MySQL logs. Apparently, by default, MySQL keeps its log files indefinitely. That TID gives you the steps to remove the MySQL logs after a certain time period.
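I don't have the TID in front of me, but the usual way to keep MySQL's binary logs from growing forever (this is my own sketch, not a quote from TID 7012442) is to purge them and cap their retention:

# One-off purge of binary logs older than 14 days (adjust the interval to taste):
mysql -u root -p -e "PURGE BINARY LOGS BEFORE NOW() - INTERVAL 14 DAY;"

# To keep them trimmed automatically, set this under [mysqld] in my.cnf
# and restart MySQL:
#   expire_logs_days = 14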

Al
Absent Member.

Re: Root Full

abens;2266231 wrote:
Eric
Which log files? I found TID 7012442 for the MySQL logs. Apparently, by default, MySQL keeps its log files indefinitely. That TID gives you the steps to remove the MySQL logs after a certain time period.

Al


Thanks for the tip.

It was mainly the ssf logs, full of file access errors such as:
2013-06-06 00:42:16,667 ERROR [Sitescape_Worker-15] [com.novell.teaming.module.folder.impl.PlusFolderModule] - (full) Failed to process entity with path [/Home Workspace/Net Folders/Executive/HowtoDesignMeaningfulMeasures/How-to/HANDOUT sensory language.pdf]: org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: org.kablink.teaming.domain.Binder.attachments, no session or session was closed
org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: org.kablink.teaming.domain.Binder.attachments, no session or session was closed

I suspect it has to do with the fact that we use Dynamic Storage Technology, plus some other errors other people have already logged.
My MySQL log is tiny and doesn't seem to be an issue at this stage.
Thanks,
Eric.