
Re: [IMP-dev] removing modeldata



Here is a clean patch.

Attachment: model_data
Description: Binary data





As for check-standards, it works just fine in a clean checkout of IMP, so there is state somewhere in my IMP tree that is tripping it up. "svn status" doesn't show any suspicious files. Furthermore, running check-standards.py on itself (or on any other single file) bombs, so it is not a matter of it finding random files. Any ideas?
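(For reference, a minimal way to narrow this down is to run a single file through the same tokenize pass that reindent.py performs. This is just a sketch, not part of IMP, and uses Python 2 syntax to match the Python 2.5 in the traceback below:)

import sys
import tokenize

# Hypothetical helper, not part of IMP: tokenize one file at a time so a
# failure here points at that file's contents rather than at
# check-standards.py itself.
def try_tokenize(filename):
    f = open(filename)
    try:
        for token in tokenize.generate_tokens(f.readline):
            pass
        return True
    except tokenize.TokenError, e:
        print "%s: %s" % (filename, e)
        return False
    finally:
        f.close()

if __name__ == "__main__":
    for filename in sys.argv[1:]:
        try_tokenize(filename)

(If this reports the same TokenError on a file that Python itself handles happily, the problem is probably with that copy of the file, e.g. truncation or odd line endings, rather than with check-standards.py.)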

Time to sleep.


On Feb 26, 2008, at 11:38 PM, Daniel Russel wrote:


On Feb 26, 2008, at 10:18 PM, Ben Webb wrote:

Daniel Russel wrote:
I wanted some easy coding, so I made a quick pass at removing ModelData.

It doesn't compile on my Mac:
The Mac gcc completely ignores the friend declaration. Just make everything public in FloatIndex for now; it is protected in Optimizer anyway.



Anyway, the tests pass. Check standards still bombs, so I don't know if it passes that.

No, it doesn't. I don't know why it doesn't work on your system, because it runs fine on every machine I've tried it on. It's not like it's doing anything complicated, anyway - just reading a file line by line, and applying a handful of regexes. So I suspect you have a screwed up Python path or something on your machine.
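
(For reference, a minimal sketch of that kind of line-by-line regex check, with made-up rules rather than the real check-standards.py ones; Python 2, to match the environment in the traceback below:)

import re

# Made-up example rules; the real check-standards.py defines its own.
_checks = [(re.compile(r"\t"), "tab character"),
           (re.compile(r"[ \t]+$"), "trailing whitespace")]

def check_one_file(filename, errors):
    # Read the file line by line and record every rule that matches.
    f = open(filename)
    for i, line in enumerate(f):
        for pattern, message in _checks:
            if pattern.search(line):
                errors.append("%s:%d: %s" % (filename, i + 1, message))
    f.close()

if __name__ == "__main__":
    import sys
    errors = []
    for filename in sys.argv[1:]:
        check_one_file(filename, errors)
    for e in errors:
        print e
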
I get the same errors on my Mac and on flute, and without PYTHONPATH being set to anything.

Traceback (most recent call last):
  File "tools/check-standards.py", line 81, in <module>
    main()
  File "tools/check-standards.py", line 72, in main
    check_modified_file(filename, errors)
  File "tools/check-standards.py", line 47, in check_modified_file
    check_python_file(filename, errors)
  File "tools/check-standards.py", line 36, in check_python_file
    if r.run():
  File "/Users/drussel/src/IMP/tools/reindent.py", line 166, in run
    tokenize.tokenize(self.getline, self.tokeneater)
  File "/System/Library/Frameworks/Python.framework/Versions/2.5/lib/
python2.5/tokenize.py", line 153, in tokenize
    tokenize_loop(readline, tokeneater)
  File "/System/Library/Frameworks/Python.framework/Versions/2.5/lib/
python2.5/tokenize.py", line 159, in tokenize_loop
    for token_info in generate_tokens(readline):
  File "/System/Library/Frameworks/Python.framework/Versions/2.5/lib/
python2.5/tokenize.py", line 283, in generate_tokens
    raise TokenError, ("EOF in multi-line statement", (lnum, 0))
tokenize.TokenError: ('EOF in multi-line statement', (484, 0))
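
(That TokenError means the tokenizer hit end-of-file while it still thought it was inside a multi-line statement, e.g. an unclosed parenthesis or bracket, around line 484 of whatever file it was reading. A tiny reproduction, assuming Python 2 as in the traceback:)

import tokenize
from StringIO import StringIO

# An unclosed parenthesis: the tokenizer reaches EOF while the
# statement is still open and raises TokenError.
broken = StringIO("x = (1 +\n")
try:
    for token in tokenize.generate_tokens(broken.readline):
        pass
except tokenize.TokenError, e:
    print e   # prints something like ('EOF in multi-line statement', (2, 0))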

It is probably a problem parsing some non-IMP file floating around in the source tree; "scons standards" tends to process random files it finds lying around. But there are no extra .py, .h or .cpp files around, so I am not sure what it chokes on.