
Let's say a news team used hoover to search through a private collection. They found a few interesting documents and want to publish an article, so they need to link to or embed those documents to support their case. They could upload them to a public repository (Dropbox, Google Drive, Scribd, etc.) or to their own website. But what if they could use hoover for this?

They would create a new public collection containing only the documents they wish to publish, either on the same hoover installation or on a separate instance. Then they need a way to embed the documents, or at least link to them.

  • The server needs to handle large amounts of traffic. We can put an HTTP cache in front of it, but we need to make sure the responses from hoover are cacheable (see the caching sketch after this list).
  • If they link to the raw documents, is it possible to get back to the document preview, or to the search homepage? If not, would it make sense for them to say, at the end of the article, "by the way, here's a search engine for all our published documents"? We need to be extra sure that the search API we expose on top of Elasticsearch can't be abused.
  • Do we provide an embed widget or iframe code? Should hoover-search return a special embed view for a document? (A rough sketch follows the list.)
  • Should we add embedding instructions to the document preview page?
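
On the caching point, here is a minimal sketch of what a cache-friendly response could look like, assuming hoover-search stays a Django application. The view name, URL parameters and `load_document()` helper are hypothetical, not the actual hoover code; only the Django decorator is real.

```python
# Hypothetical sketch of a cache-friendly view for documents in a public
# collection. The view name, URL parameters and load_document() helper are
# made up for illustration.
from django.http import HttpResponse
from django.views.decorators.cache import cache_control

@cache_control(public=True, max_age=3600)  # let shared HTTP caches keep it for an hour
def raw_document(request, collection, doc_hash):
    data, content_type = load_document(collection, doc_hash)  # hypothetical loader
    response = HttpResponse(data, content_type=content_type)
    # A published document never changes, so a stable ETag lets the cache
    # revalidate cheaply instead of re-fetching the whole file from hoover.
    response['ETag'] = '"%s"' % doc_hash
    return response
```

With headers like these, an nginx or Varnish cache sitting in front of hoover can absorb most of the article-driven traffic.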
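
If hoover-search grew a dedicated embed view, it could be a stripped-down preview page meant to be loaded inside an iframe on the publisher's site. A rough sketch, again assuming Django; the route, template name and metadata helper are assumptions:

```python
# Hypothetical embed view: a minimal document preview intended to be loaded
# inside an <iframe> on the news site. Route, template name and the metadata
# helper are assumptions for illustration.
from django.shortcuts import render
from django.views.decorators.cache import cache_control
from django.views.decorators.clickjacking import xframe_options_exempt

@xframe_options_exempt           # allow the page to be framed by other sites
@cache_control(public=True, max_age=3600)
def embed_document(request, collection, doc_hash):
    doc = load_document_metadata(collection, doc_hash)  # hypothetical helper
    return render(request, 'embed.html', {'doc': doc})
```

The document preview page could then show ready-made iframe markup pointing at such a view, which would also cover the question about embedding instructions.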