[Swift-devel] Issues when running swift on fuse

yizhu yizhu at cs.uchicago.edu
Fri May 29 13:09:13 CDT 2009


Mihael Hategan wrote:
> That's strange. You are getting multiple errors, both for the log file
> and the restart log. But the restart log should be small (and for
> first.swift, so should be the execution log). So I suspect the error
> message is misleading.
Yes, I think you are right. I noticed that Swift appears to be generating a 
log file with a reported size of more than 18446744073 GB, which is what 
causes the "File too large" problem.
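
For reference, the number shown by 'ls -l' below -- 18446744073709551615 
bytes -- is exactly 2^64 - 1, i.e. what a 64-bit value of -1 looks like 
when printed as unsigned, so it may be s3fs mis-reporting the size rather 
than Swift actually writing that much. A quick, purely illustrative check 
in plain Java:

import java.math.BigInteger;

// Illustrative only: show that the size reported by 'ls -l' equals 2^64 - 1,
// the unsigned form of a 64-bit -1.
public class ReportedSizeCheck {
    public static void main(String[] args) {
        BigInteger maxUnsigned64 = BigInteger.ONE.shiftLeft(64).subtract(BigInteger.ONE);
        System.out.println(maxUnsigned64);   // prints 18446744073709551615
    }
}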


> 
> Can you provide more details? Such as:
> - the actual size of the files in question at the time the error(s)
> occurred
For a normal execution of the first.swift code, it generates the following 
files:

[torqueuser at ip-10-251-89-208 test]$ pwd
/mnt/ebs/test
[torqueuser at ip-10-251-89-208 test]$ ls -l
-rw-rw-r--  1 torqueuser torqueuser 25141 May 29 12:33 
first-20090529-1233-nj1nazbf.log
-rw-rw-r--  1 torqueuser torqueuser 25406 May 29 12:34 
first-20090529-1234-bgzd13z4.log
-rw-rw-r--  1 torqueuser torqueuser    14 May 29 12:34 hello.txt
-rw-rw-r--  1 torqueuser torqueuser   114 May 29 12:34 swift.log

but when I try to run it on S3 (mounted via s3fs), the error occurs and the 
following files are generated:

[torqueuser at ip-10-251-89-208 abc]$ pwd
/mnt/s3/abc

[torqueuser at ip-10-251-89-208 abc]$ ls -l
total 2
-rw-rw-r--  1 torqueuser torqueuser 18446744073709551615 May 29 12:35 
first-20090529-1235-xjyfu9r4.0.rlog
-rw-rw-r--  1 torqueuser torqueuser 18446744073709551615 May 29 12:35 
first-20090529-1235-xjyfu9r4.log
-rw-rw-r--  1 torqueuser torqueuser                   14 May 29 12:35 
hello.txt
-rw-rw-r--  1 torqueuser torqueuser 18446744073709551615 May 29 12:35 
swift.log
[torqueuser at ip-10-251-89-208 abc]$






> - the motivation behind having the swift working directory (as opposed
> to just the data) on S3
> - any relevant quotas or limitations of the S3 fs (output of 'quota' if
> applicable and 'df').

'quota' produces no output; the output of 'df' is below (/mnt/s3 is the S3 
directory):

[torqueuser at ip-10-251-89-208 s3]$ df
Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/sda1              4128448   2744188   1174548  71% /
none                    873880         0    873880   0% /dev/shm
/dev/sda2            153899044   1999124 144082296   2% /mnt
/dev/sdf              10321208    290468   9506452   3% /mnt/ebs
fuse                 274877906944         0 274877906944   0% /mnt/s3


Normally we can create a single file of around 1 GB with no problem, so it 
seems something goes wrong specifically when Swift tries to generate its 
logs on the s3fs mount. (I am using Swift 0.9.)
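
To see whether this can be reproduced outside Swift, something along the 
lines of what log4j's FileAppender does (open the file in append mode, 
write, flush) could be run directly against the s3fs mount. This is only a 
sketch; the file name is just a placeholder:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;

// Sketch of the write pattern seen in the stack trace below: a
// FileOutputStream opened in append mode, wrapped in an OutputStreamWriter,
// and flushed after each write (as log4j's QuietWriter does).
public class S3FuseAppendTest {
    public static void main(String[] args) throws IOException {
        OutputStreamWriter writer = new OutputStreamWriter(
                new FileOutputStream("/mnt/s3/abc/append-test.log", true));
        for (int i = 0; i < 10; i++) {
            writer.write("log line " + i + "\n");
            writer.flush();   // the flush is where the IOException surfaces in log4j
        }
        writer.close();
    }
}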

I then tried running Swift locally but with the output file written to the 
S3 directory, and that works correctly (the log file is stored in the local 
directory where I run Swift, and the output file is in the S3 directory).
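
For contrast, a plain create-write-close -- roughly the kind of write that 
is known to work here (the hello.txt output, or the ~1 GB test file 
mentioned above) -- would look like this; the file name is again just an 
example:

import java.io.FileOutputStream;
import java.io.IOException;

// Write-once pattern: create the file, write the data, close it.
// Per the observations above, this kind of write succeeds on the s3fs mount.
public class S3FuseWriteOnceTest {
    public static void main(String[] args) throws IOException {
        FileOutputStream out = new FileOutputStream("/mnt/s3/abc/write-once.txt");
        out.write("Hello, world!\n".getBytes());
        out.close();
    }
}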




> On Fri, 2009-05-29 at 02:13 -0500, yizhu wrote:
>> Hi,
>>
>>
>> I tried to run a Swift script in a directory mounted via FUSE (s3fs), but
>> got the following error:
>> (-debug output attached)
>>
>> Since I execute Swift in /mnt/s3, the output file should be written there.
>> I have rw permission on /mnt/s3, and the job seems to run correctly (the
>> output file hello.txt is there); it is just an issue with the log file.
>>
>> Is there a way to turn off the logging, or does anyone have ideas on how 
>> to solve this problem?
>>
>> Thank you very much!
>>
>>
>> -From
>>
>> Yi Zhu
>>
>>
>>
>> [torqueuser at ip-10-251-89-208 ~]$ cd /mnt/s3
>> [torqueuser at ip-10-251-89-208 s3]$ swift 
>> /home/torqueuser/swift-0.9/examples/swift/first.swift
>> log4j:ERROR Failed to flush writer,
>> java.io.IOException: File too large
>> 	at java.io.FileOutputStream.writeBytes(Native Method)
>> 	at java.io.FileOutputStream.write(FileOutputStream.java:260)
>> 	at sun.nio.cs.StreamEncoder$CharsetSE.writeBytes(StreamEncoder.java:336)
>> 	at 
>> sun.nio.cs.StreamEncoder$CharsetSE.implFlushBuffer(StreamEncoder.java:404)
>> 	at sun.nio.cs.StreamEncoder$CharsetSE.implFlush(StreamEncoder.java:408)
>> 	at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:152)
>> 	at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:213)
>> 	at org.apache.log4j.helpers.QuietWriter.flush(QuietWriter.java:49)
>> 	at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:306)
>> 	at org.apache.log4j.WriterAppender.append(WriterAppender.java:150)
>> 	at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:221)
>> 	at 
>> org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:57)
>> 	at org.apache.log4j.Category.callAppenders(Category.java:187)
>> 	at org.apache.log4j.Category.forcedLog(Category.java:372)
>> 	at org.apache.log4j.Category.debug(Category.java:241)
>> 	at org.griphyn.vdl.karajan.Loader.main(Loader.java:75)
>> Swift 0.9 swift-r2860 cog-r2388
>>
>> RunID: 20090529-0204-55gjsnhc
>> Progress:  uninitialized:1
>> Progress:  Submitted:1
>> Progress:  Active:1
>> Ex098
>> org.globus.cog.karajan.workflow.KarajanRuntimeException: Exception 
>> caught while writing to log file
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.restartLog.LogVargOperator.update(LogVargOperator.java:40)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.functions.VariableArgumentsOperator.append(VariableArgumentsOperator.java:38)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.functions.VariableArgumentsOperator.appendAll(VariableArgumentsOperator.java:44)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.functions.VariableArgumentsOperator.merge(VariableArgumentsOperator.java:34)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.SequentialChoice.commitBuffers(SequentialChoice.java:52)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.SequentialChoice.childCompleted(SequentialChoice.java:41)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.Sequential.notificationEvent(Sequential.java:33)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.SequentialChoice.notificationEvent(SequentialChoice.java:66)
>> 	at org.globus.cog.karajan.workflow.nodes.FlowNode.event(FlowNode.java:332)
>> 	at org.globus.cog.karajan.workflow.events.EventBus.send(EventBus.java:125)
>> 	at 
>> org.globus.cog.karajan.workflow.events.EventBus.sendHooked(EventBus.java:99)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowNode.fireNotificationEvent(FlowNode.java:176)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowNode.complete(FlowNode.java:296)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowContainer.post(FlowContainer.java:58)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.Sequential.startNext(Sequential.java:51)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.Sequential.childCompleted(Sequential.java:45)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.Sequential.notificationEvent(Sequential.java:33)
>> 	at org.globus.cog.karajan.workflow.nodes.FlowNode.event(FlowNode.java:332)
>> 	at org.globus.cog.karajan.workflow.events.EventBus.send(EventBus.java:125)
>> 	at 
>> org.globus.cog.karajan.workflow.events.EventBus.sendHooked(EventBus.java:99)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowNode.fireNotificationEvent(FlowNode.java:176)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowNode.complete(FlowNode.java:296)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowContainer.post(FlowContainer.java:58)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.functions.AbstractFunction.post(AbstractFunction.java:46)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.AbstractSequentialWithArguments.childCompleted(AbstractSequentialWithArguments.java:192)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.Sequential.notificationEvent(Sequential.java:33)
>> 	at org.globus.cog.karajan.workflow.nodes.FlowNode.event(FlowNode.java:332)
>> 	at org.globus.cog.karajan.workflow.events.EventBus.send(EventBus.java:125)
>> 	at 
>> org.globus.cog.karajan.workflow.events.EventBus.sendHooked(EventBus.java:99)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowNode.fireNotificationEvent(FlowNode.java:176)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowNode.complete(FlowNode.java:296)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.functions.AbstractFunction.executeChildren(AbstractFunction.java:37)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowContainer.execute(FlowContainer.java:63)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowNode.restart(FlowNode.java:233)
>> 	at org.globus.cog.karajan.workflow.nodes.FlowNode.start(FlowNode.java:278)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.FlowNode.controlEvent(FlowNode.java:391)
>> 	at org.globus.cog.karajan.workflow.nodes.FlowNode.event(FlowNode.java:329)
>> 	at 
>> org.globus.cog.karajan.workflow.FlowElementWrapper.event(FlowElementWrapper.java:227)
>> 	at org.globus.cog.karajan.workflow.events.EventBus.send(EventBus.java:125)
>> 	at 
>> org.globus.cog.karajan.workflow.events.EventBus.sendHooked(EventBus.java:99)
>> 	at 
>> org.globus.cog.karajan.workflow.events.EventWorker.run(EventWorker.java:69)
>> Caused by: java.io.IOException: File too large
>> 	at java.io.FileOutputStream.writeBytes(Native Method)
>> 	at java.io.FileOutputStream.write(FileOutputStream.java:260)
>> 	at sun.nio.cs.StreamEncoder$CharsetSE.writeBytes(StreamEncoder.java:336)
>> 	at 
>> sun.nio.cs.StreamEncoder$CharsetSE.implFlushBuffer(StreamEncoder.java:404)
>> 	at sun.nio.cs.StreamEncoder$CharsetSE.implFlush(StreamEncoder.java:408)
>> 	at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:152)
>> 	at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:213)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.restartLog.FlushableLockedFileWriter.flush(FlushableLockedFileWriter.java:39)
>> 	at 
>> org.globus.cog.karajan.workflow.nodes.restartLog.LogVargOperator.update(LogVargOperator.java:37)
>> 	... 40 more
>> Execution failed:
>> 	Exception caught while writing to log file
>> Caused by:
>> 	File too large
>> [torqueuser at ip-10-251-89-208 s3]$
>>
>>
>>
>> _______________________________________________
>> Swift-devel mailing list
>> Swift-devel at ci.uchicago.edu
>> http://mail.ci.uchicago.edu/mailman/listinfo/swift-devel
> 
> 



