Hi Nicolas,
From what I've read on the topic, a Disallow rule in the robots.txt file is not a guarantee that the URL will stay out of the search results:
"When you block URLs from being indexed in Google via robots.txt, they may still show those pages as URL only listings in their search results." - this is from the link you provided.
Google also states that they have "workarounds" to detect and list pages that are disallowed by the robots.txt file.
https://support.google.com/webmasters/answer/6062608?hl=en
"You should not use robots.txt as a means to hide your web pages from Google Search results. This is because other pages might point to your page, and your page could get indexed that way, avoiding the robots.txt file. If you want to block your page from search results, use another method such as password protection or noindex tags or directives"
Is this the only option?