<div class="gmail_quote">On Mon, Aug 2, 2010 at 4:31 PM, Jorge Williams <span dir="ltr"><<a href="mailto:jorge.williams@rackspace.com">jorge.williams@rackspace.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div style="word-wrap:break-word"><div><div>Hmm.. Let me see if I understand what you're saying. Correct me if I'm wrong here. You're still advocating a proxy approach where an HTTP request is sent from one proxy to another... (Parton my txt drawings)</div>
<div><font class="Apple-style-span" face="'Courier New'" size="3"><span class="Apple-style-span" style="font-size: 11px;"><b><font class="Apple-style-span" face="arial"><span class="Apple-style-span" style="font-weight: normal; font-size: small;"><br>
</span></font></b></span></font></div><div>...but you are proposing that individual proxies can make sideways calls to make additional service requests...</div><div><br></div><div><div>If so, that's exactly along the lines of what I was thinking. </div>

Yep. I hadn't argued for putting a cache as its own layer, but I think the
bigger picture is the same -- a stack of proxies that can call out sideways.
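
To make the "sideways call" idea concrete, here's a rough Python sketch of a
WSGI-style layer that checks a token against the IDM service before handing
the request to the next layer down. The URL layout, header name, and token
endpoint below are invented for illustration, not a real API:

# Rough sketch only -- the IDM URL, header name, and token-check path
# are invented for illustration.
import urllib.error
import urllib.request


class AuthLayer(object):
    """A proxy layer that makes a sideways call to an IDM service
    before passing the request to the next layer in the stack."""

    def __init__(self, nextapp, idm_url):
        self.nextapp = nextapp  # the layer directly downstream
        self.idm_url = idm_url  # the sideways service

    def __call__(self, environ, start_response):
        token = environ.get('HTTP_X_AUTH_TOKEN')
        if not token or not self._token_is_valid(token):
            start_response('401 Unauthorized',
                           [('Content-Type', 'text/plain')])
            return [b'Unauthorized']
        return self.nextapp(environ, start_response)

    def _token_is_valid(self, token):
        # The sideways call: an ordinary HTTP GET to the IDM service.
        url = '%s/tokens/%s' % (self.idm_url, token)
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.status == 200
        except urllib.error.HTTPError:
            return False


# The whole stack is just composition, e.g.:
#   app = CacheLayer(AuthLayer(RateLimitLayer(api_endpoint),
#                              'http://idm.internal'))

Every layer would have the same shape: it either answers the request itself or
passes it to the next layer, and any of them can make its own sideways calls.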
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;"><div style="word-wrap:break-word"><div><div><br></div><div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> [ SSL Term ] </span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | </span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> v </span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> +------------>[ Cache ]</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | |</span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | v</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | [ Auth ]--->[ IDM SERVICE ]</span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | |</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | v</span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | [ Rate Limit ]</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> (Purge) |</span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | v</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | [API Endpoint]</span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | |</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | v</span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | ...</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | |</span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | |</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | |</span></font></b></div>
<div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> | |</span></font></b></div><div><b><font face="'Courier New'" size="3"><span style="font-size:11px"> +-------------------+</span></font></b></div>
>
> For example, say a user issues a command to delete a server. We'll need to
> purge every representation (XML, JSON, XML GZip, JSON GZip) of that server
> from the front-end cache. I suppose we could detect the delete operation at
> the caching stage, but that means having a very customized cache, and I'd
> like to reuse that code for different APIs. What's more, certain events may
> trigger cache purges from the backend directly, say a server transitioning
> from "RESIZE" to "ACTIVE". I really don't see how we can avoid these
> downstream communications entirely.

If we made the cache a proxy layer, then I'd agree that communication would
loop upstream again. If we made the cache a service, accessible from any proxy
via an API, then we avoid that complexity. So I'd definitely argue for the
latter approach for caching (as does Eric). As a counterexample, I'd keep rate
limiting as its own layer, since it's needed in one place and isn't called
regularly by anything downstream.
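
As a sketch of what I mean by the cache being a service with its own API: any
layer, or the backend itself, could call something like the purge helper below
when a server is deleted or changes state. The purge endpoint, headers, and
cache-key scheme here are hypothetical:

# Hypothetical cache-service client -- the purge endpoint, headers, and
# key scheme are invented for illustration.
import urllib.parse
import urllib.request

# Every representation we may have cached for a given resource.
REPRESENTATIONS = [
    ('application/xml', 'identity'),
    ('application/json', 'identity'),
    ('application/xml', 'gzip'),
    ('application/json', 'gzip'),
]


def purge_server(cache_api, server_id):
    """Drop every cached representation of a server, e.g. after a DELETE
    or after a backend transition such as RESIZE -> ACTIVE."""
    key = urllib.parse.quote('/servers/%s' % server_id, safe='/')
    for content_type, encoding in REPRESENTATIONS:
        req = urllib.request.Request(
            '%s/purge?key=%s' % (cache_api, key),
            headers={'X-Purge-Accept': content_type,
                     'X-Purge-Accept-Encoding': encoding},
            method='POST')
        urllib.request.urlopen(req)


# Called from the API layer when it sees the DELETE go through, or from
# the backend when the server's state changes:
#   purge_server('http://cache.internal', 'abc123')

The point is that nothing has to loop back up through the proxy stack: the
cache exposes one small API, and anything that knows a server changed can
call it directly.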

Michael