
Concatenate errors during parsing #14

Closed

Conversation

peverwhee
Owner

@mwaxmonsky you can ignore this (for now!); opening this PR to get some feedback from @nusbaume on how to do this... better.

@peverwhee peverwhee requested a review from nusbaume February 21, 2024 18:56
@mwaxmonsky
Collaborator

One thing that occurred to me well after looking at this: it might make sense to use an observer pattern (https://www.geeksforgeeks.org/observer-method-python-design-patterns/), similar to how logging is handled in Python (https://docs.python.org/3/howto/logging.html#configuring-logging).
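To make the idea concrete, here is a minimal observer-pattern sketch for collecting parse errors instead of stopping at the first one. All names here (`ErrorCollector`, `Parser`, `notify`, `attach`) are illustrative, not part of any existing API in this repo:

```python
class ErrorCollector:
    """Observer that accumulates error messages as they are reported."""
    def __init__(self):
        self.errors = []

    def notify(self, message):
        self.errors.append(message)


class Parser:
    """Subject that notifies registered observers when an error occurs."""
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def parse(self, text):
        # Toy "parse": flag any line missing an '=' sign, but keep going
        for lineno, line in enumerate(text.splitlines(), start=1):
            if "=" not in line:
                for obs in self._observers:
                    obs.notify(f"line {lineno}: missing '='")


collector = ErrorCollector()
parser = Parser()
parser.attach(collector)
parser.parse("a = 1\nbad line\nb = 2")
print(collector.errors)  # all errors reported in one batch at the end
```

The point of the pattern is that the parsing code never decides how errors are displayed; it just notifies whoever is listening.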

So we might be able to implement something that allows for:

h = handler.setup('parse_errors')
# Call parsing code
if not h.empty():
  # print all errors

then in the called code:

# parse code
# if error detected:
  h = handler.get_global('parse_errors')  # note: `global` is a reserved word in Python
  h.addError(...)
  # determine exit pattern

@mwaxmonsky
Collaborator

Okay, I think we have a proof of concept that we can try. In the main python file where we start processing files, we can have:

import logging
import logging.handlers
import queue

...

#in processing function
  message_queue = queue.Queue(-1)
  queue_handler = logging.handlers.QueueHandler(message_queue)
  stream_handler = logging.StreamHandler()
  queue_listener = logging.handlers.QueueListener(message_queue, stream_handler)
  metadata_processing_logger = logging.getLogger("metadata_processing_logger")
  metadata_processing_logger.addHandler(queue_handler)

  #call processing functions

  if message_queue.qsize() > 0:
    #determine how to handle errors
    
    # These will flush the queued messages via the stream handler (stderr by default)
    queue_listener.start()
    queue_listener.stop()

Then, in the different parsing code segments, we can have:

import logging

# in analyze() function
  metadata_processing_logger = logging.getLogger("metadata_processing_logger")
  metadata_processing_logger.error(our_error_message)

Just let me know if you need any help testing this out and I'd be happy to help!

@nusbaume
Collaborator

This is exactly what I was looking for, thanks @mwaxmonsky! @peverwhee let me know if you have any problems implementing this solution, or if it ends up being more complex than anticipated.

Thanks again!

@peverwhee
Owner Author

closing - preempted by another solution (related to #15)

@peverwhee peverwhee closed this Apr 23, 2024