    #8627
    A-D-T
    Participant

    hi!

    I just encountered a problem with large files and the find-replace function:
    I’ve got a big file with different datasets like

    1|asdf|12348|asdf|….
    2|1234855|asdf|238….
    3|238ahsh|asdf|238….
    and so on

    I just want to keep all datasets starting with “2|”,
    so I delete every other dataset with the regex “^[1345]\|.*$”.

    This works well and leaves me with a big file containing only “2|” datasets and many empty lines. To optimize the ongoing use of the new file, I also want to delete the empty lines with “^\n”. And now comes the problem: it works, but it is AWFULLY SLOW (approx. 1% of the file per 30 seconds).
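    For illustration, here is a minimal Python sketch of the same two-step idea, assuming the expressions behave like ordinary Perl-style regular expressions (the sample lines are abbreviated from above; everything else is made up for the example):

    import re

    # Sample datasets, abbreviated from the example above.
    sample = "1|asdf|12348|asdf\n2|1234855|asdf|238\n3|238ahsh|asdf|238\n"

    # Step 1: blank out every dataset that does not start with "2|".
    step1 = re.sub(r"^[1345]\|.*$", "", sample, flags=re.MULTILINE)

    # Step 2: remove the empty lines left behind by step 1.
    step2 = re.sub(r"^\n", "", step1, flags=re.MULTILINE)

    print(step2)  # prints only the "2|" dataset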

    Am I doing something wrong, or is this a problem within EmEditor/Regex++? Is there another way to get rid of the empty lines?

    cheers
    A-D-T

    #8630
    Yutaka Emura
    Keymaster

    Hello A-D-T,

    I am sorry, but when EmEditor opens a large file using a temporary file (that is, a file larger than the size specified in the Advanced tab of the Customize dialog box), it becomes slow whenever it needs to add or remove lines. If possible, increase this size so that it is a little larger than the file you are trying to open.
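    As an illustrative alternative (a rough sketch, not an EmEditor feature), a single streaming pass outside the editor copies only the wanted datasets, so no empty lines are ever created and no lines have to be removed from the large file; the file names below are assumptions:

    # Stream the large file once and keep only datasets starting with "2|".
    # "datasets.txt" and "datasets_2_only.txt" are placeholder names.
    with open("datasets.txt", encoding="utf-8") as src, \
            open("datasets_2_only.txt", "w", encoding="utf-8") as dst:
        for line in src:
            if line.startswith("2|"):
                dst.write(line)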

    Please let me know if you have further questions.
    Thank you,

    #8642
    A-D-T
    Participant

    OK, that explains it!
    After raising the size, it worked fine.
    Thanks!
