How to Evaluate a Marketing Agency's Portfolio

Learn exactly how to evaluate a marketing agency's portfolio—from case study quality to red flags and a scoring rubric you can use before signing any contract.

Choosing a marketing agency is one of the most consequential decisions a brand can make. Yet most businesses approach portfolio reviews the same way they scroll social media—quickly, passively, and without a framework. A polished website and a handful of impressive logos do not tell you whether an agency can actually deliver results for your business.

This guide gives you a structured method to evaluate any agency's portfolio, identify genuine capability, spot red flags before they cost you money, and use a practical scoring rubric to compare multiple agencies side by side.


Why Portfolio Evaluation Matters More Than Credentials

An agency's portfolio is its most honest self-portrait. Credentials, awards, and case study decks are curated—but the underlying work rarely lies. A portfolio reveals strategic thinking, creative range, technical execution, and how an agency measures success.

In Myanmar's growing digital marketing landscape, where around 280 registered agencies operate (76% of them based in Yangon), many firms are relatively young—the average agency age is just 2.5 years. That means the portfolio is often the only reliable evidence of capability you have.


Quality vs. Quantity of Work

The first thing to resist is being impressed by volume. An agency showcasing 50 campaigns is not necessarily better than one showcasing eight well-documented ones.

What quality actually looks like

  • Each case study tells a coherent story: challenge, approach, execution, result
  • The work reflects a clear strategic brief, not just aesthetic choices
  • Results are specific, time-bound, and tied to business objectives

The problem with quantity

When a portfolio is heavy on samples but thin on context, it signals one of two things: the agency is padding to appear experienced, or they do not know which work to be proud of. Neither is reassuring.

Rule of thumb: Three well-documented case studies with measurable outcomes beat thirty pretty images with no context.


Industry Relevance and Category Experience

Not all experience transfers equally. An agency that has spent five years running campaigns for FMCG brands has learned a very different craft than one specializing in B2B software or hospitality.

Why it matters

  • Audience behavior, buying cycles, and messaging tone differ by industry
  • Regulatory constraints (finance, healthcare, food and beverage) require specialized knowledge
  • Platform mix often differs: FMCG brands lean heavily on Facebook and TikTok in Myanmar; B2B brands may need LinkedIn and email automation

How to assess it

Ask directly: "How many clients in our category have you worked with in the past two years?" Then ask to speak with one. An agency confident in its vertical experience will welcome this.

If your category is underrepresented in their portfolio, that is not automatically disqualifying—but you should understand why and assess whether their adjacent experience translates.


Measurable Results vs. Vanity Metrics

This is where many agencies expose themselves. Portfolio case studies that lead with follower counts, impressions, and "viral" posts are showcasing vanity metrics—numbers that look impressive but do not connect to business outcomes.

What to look for instead

For each vanity metric an agency leads with, ask for the business metric behind it:

  • 10,000 new followers → cost per new follower vs. revenue attributed to the social channel
  • 2M impressions → reach among the target audience, plus frequency
  • "Viral" campaign → engagement rate, website traffic spike, conversion lift
  • #1 trending hashtag → share of voice vs. competitors, brand recall change
  • 500 post shares → leads generated, email sign-ups, or sales influenced
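Most of these translations come down to simple arithmetic. A minimal sketch, using hypothetical figures (the spend, follower, and revenue numbers below are made up for illustration; substitute whatever the agency actually reports):

```python
# Hypothetical campaign figures for illustration -- substitute the
# agency's reported numbers when reviewing a case study.
ad_spend = 1500.00            # total campaign spend (USD)
new_followers = 10_000        # the vanity number the case study leads with
attributed_revenue = 4200.00  # revenue the client tracked to the channel

cost_per_follower = ad_spend / new_followers
roi = (attributed_revenue - ad_spend) / ad_spend

print(f"Cost per new follower: ${cost_per_follower:.2f}")  # $0.15
print(f"Return on ad spend:    {roi:.0%}")                 # 180%
```

If the agency cannot supply the spend and revenue inputs for a calculation like this, the follower count on its own tells you nothing.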

The right question to ask

"What was the business outcome of this campaign, and how did you measure it?" If the answer is vague—"the client was really happy" or "it got great engagement"—the agency is not a results-oriented shop.


Creative Consistency and Brand Adherence

Strong agencies do not impose their own aesthetic on every client. They develop creative that serves each brand's identity, voice, and audience.

Signs of genuine creative discipline

  • You can identify each brand's visual language without seeing the logo
  • Copy tone shifts appropriately between a luxury brand and a youth-focused FMCG
  • Design systems are coherent across multiple touchpoints (social, out-of-home, digital ads, web)

Signs of lazy creative

  • All case studies have the same visual style regardless of client
  • Photography is generic or clearly sourced from stock libraries without customization
  • Headlines read like templates rather than tailored brand voices

Stock imagery is acceptable for certain uses, such as supporting content or placeholders. But an agency that relies on unmodified stock photos for campaign hero images is signaling limited creative investment.


Platform Diversity

Myanmar's digital landscape spans Facebook (still the dominant platform with 25M+ users), TikTok (the fastest-growing), Instagram, YouTube, and increasingly programmatic display. A capable agency should demonstrate competency across multiple channels.

What to check

  • Does the portfolio include work across at least three platforms?
  • Is there evidence of platform-native content (Reels vs. static posts vs. TikTok-first video)?
  • Does the agency understand how content formats differ between platforms—not just resize assets?

An agency that only shows Facebook carousel ads likely lacks the creative and strategic range to run an integrated campaign.


Before-and-After Comparisons

The most compelling portfolio entries show transformation. This applies to both creative work (rebrands, website redesigns) and performance work (campaign results).

What good before-and-after looks like

  • Rebrand case studies: Previous brand identity with clear diagnosis of its weaknesses, new identity with rationale for every decision
  • Performance case studies: Baseline metrics clearly stated (organic traffic, CPC, conversion rate) before the engagement, with documented improvement over a defined period
  • Website redesigns: Old user journey mapped against new one, with UX rationale and post-launch performance data

If an agency cannot articulate the starting point, they cannot prove they improved anything.
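Proving improvement is just a division against the stated starting point, which is why the baseline matters so much. A minimal sketch, with hypothetical before/after figures (all numbers below are invented for illustration):

```python
def lift(baseline: float, after: float) -> float:
    """Percentage improvement relative to a stated baseline."""
    return (after - baseline) / baseline

# Hypothetical before/after figures from a performance case study.
traffic_lift = lift(baseline=12_000, after=18_600)   # monthly organic sessions
conversion_lift = lift(baseline=0.018, after=0.025)  # site conversion rate

print(f"Organic traffic lift: {traffic_lift:.0%}")    # 55%
print(f"Conversion rate lift: {conversion_lift:.0%}")  # 39%
```

No baseline, no denominator: a claim like "we grew traffic by 55%" is unverifiable unless the 12,000-session starting point was documented first.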


Client Testimonials and References

Testimonials on an agency's own website are the least reliable signal—they are curated and unverifiable. What matters more:

How to go beyond curated testimonials

  1. Ask for three references you can contact directly, not names on a page
  2. Look for the agency on LinkedIn and search for former clients who may have posted about the relationship
  3. Check Campaign Asia, The Drum, and SABRE award citations—these often include named client representatives who can be contacted
  4. Search Google and Facebook for the agency name + "review" or the agency name + their client's brand name

One candid five-minute phone call with a former client is worth more than a page of testimonials.


Award Recognition as a Signal

Industry awards are not the only measure of quality, but they are a credible third-party signal. The most relevant recognition for agencies operating in or around Myanmar includes:

  • Campaign Asia Agency of the Year (AOY) – the most prestigious regional benchmark; shortlisting alone signals competitive capability
  • The Drum Awards – strong for digital and creative work across Asia-Pacific
  • SABRE Awards Asia-Pacific – specialist recognition for PR and communications work

How to interpret awards

  • Shortlisted but not won: Still meaningful—it means independent judges found the work worth evaluating
  • Won a category: Significant, especially in competitive categories
  • Self-submitted industry lists: Lower signal—many "Top 10 Agency" lists are pay-to-play

Ask the agency: "Which awards have you entered in the past two years, and what were the outcomes?" Their answer reveals both their ambition and their honesty.


Red Flags to Watch For

Even a well-presented portfolio can hide serious weaknesses. Watch for these warning signs:

Creative red flags

  • Heavy use of unmodified stock imagery in campaign work
  • All case studies from the same 1–2 clients (limited range)
  • Visual style identical across all brands (agency imposing its aesthetic)
  • No evidence of Burmese-language creative for Myanmar-focused campaigns

Results red flags

  • Results stated without baselines ("We grew their following by 200%"—from what?)
  • No time frames given for results
  • Only impressions and reach cited, never conversions or revenue
  • "NDA prevents us from sharing details" used for every single case study

Process red flags

  • No mention of strategy, research, or audience insights in any case study
  • Work samples only; no explanation of brief, objective, or approach
  • Unable to name a single client you can contact as a reference

Portfolio Scoring Rubric

Use this rubric to compare agencies objectively. Score each dimension from 1 (weak) to 5 (excellent).

  • Case study depth (strategy + execution + results)
  • Industry/category relevance
  • Use of business metrics, not vanity metrics
  • Creative range across brands and platforms
  • Platform diversity
  • Before-and-after evidence
  • Client references available
  • Award recognition
  • Burmese-language/local market capability
  • Absence of red flags

Record a score and a brief note for each dimension, then add the ten scores for a total out of 50.

Interpreting your scores:

  • 40–50: Strong portfolio; proceed to proposal stage
  • 30–39: Promising but validate with references before committing
  • 20–29: Significant gaps; probe weaknesses in discovery meeting
  • Below 20: Proceed with caution or look elsewhere

Use the same rubric for every agency you evaluate so you are comparing on consistent criteria.
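The rubric above can be tallied in a few lines. A minimal sketch, with hypothetical scores for one agency (the dimension labels are shortened and every score is invented for illustration):

```python
# Hypothetical rubric scores (1-5) for one agency across the ten dimensions.
scores = {
    "Case study depth": 4,
    "Industry/category relevance": 3,
    "Business metrics, not vanity metrics": 4,
    "Creative range": 5,
    "Platform diversity": 3,
    "Before-and-after evidence": 4,
    "Client references available": 3,
    "Award recognition": 2,
    "Local market capability": 4,
    "Absence of red flags": 5,
}

total = sum(scores.values())  # out of 50

if total >= 40:
    verdict = "Strong portfolio; proceed to proposal stage"
elif total >= 30:
    verdict = "Promising but validate with references before committing"
elif total >= 20:
    verdict = "Significant gaps; probe weaknesses in discovery meeting"
else:
    verdict = "Proceed with caution or look elsewhere"

print(f"{total}/50: {verdict}")
# -> 37/50: Promising but validate with references before committing
```

Scoring each agency in the same structure makes side-by-side comparison mechanical rather than impressionistic.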


How to Request the Right Materials

When you ask an agency for portfolio materials, be specific:

"Please share three to five case studies most relevant to [your industry/objective], each including the original brief or business challenge, the strategy and approach taken, specific platforms or channels used, measurable results with baselines and time frames, and a client contact we can speak with."

An agency that responds with a generic pitch deck has not read your request carefully. That itself tells you something.


Frequently Asked Questions

Q: How many case studies should I review before making a decision? Review at least three to five in depth, prioritizing those closest to your industry, budget, and campaign type. Surface-level browsing of twenty samples is less useful than a thorough analysis of five.

Q: What if an agency says they cannot share results due to NDAs? Occasional NDAs are legitimate. But if every result is NDA-protected, it is a red flag—either the results are not good, or the agency has not cultivated relationships where clients are willing to vouch for them. Ask if they can share anonymized data or arrange a confidential reference call.

Q: Should I prioritize local Myanmar agencies or regional ones? Both can work depending on your needs. Local agencies understand Myanmar consumer behavior, platform nuances, and Burmese-language creative. Regional agencies may bring stronger capabilities in paid media, SEO, or specialized verticals. The portfolio should tell you whether their experience matches your market, regardless of headquarters.

Q: How much does award recognition matter if the agency is small or new? For small or newer agencies, entries and shortlistings matter more than wins. An agency that invests in entering Campaign Asia AOY or The Drum Awards—even without winning—is signaling professional ambition and willingness to be evaluated against peers. That is a positive signal.

Q: Can I ask to see work that was not selected for the official portfolio? Yes, and this is a smart move. Ask to see a campaign that did not perform as expected and how the agency responded. Their answer reveals far more about their character, learning culture, and transparency than their best work ever will.