Class: Seo::CannibalizationService

Inherits:
BaseService
Defined in:
app/services/seo/cannibalization_service.rb

Overview

Detects keyword cannibalization across all pages in the site.
Cannibalization occurs when multiple pages compete for the same keyword,
potentially diluting ranking power and confusing search engines.

Examples:

Find all cannibalization issues

Seo::CannibalizationService.new.process

Analyze specific keyword

Seo::CannibalizationService.new.analyze_keyword('heated floors')

Constant Summary

AT_RISK_POSITIONS =

Position range considered at risk for cannibalization.
Pages ranking in positions 5-30 are most vulnerable because they are close enough to compete with each other for the same traffic.

(5..30).freeze

MIN_COMPETING_PAGES =

Minimum number of pages ranking for the same keyword before it is flagged.

2
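A minimal plain-Ruby sketch of how these two thresholds might interact (the variable names and sample positions are hypothetical, not taken from the service):

```ruby
# Hypothetical thresholds mirroring the constants above.
AT_RISK_POSITIONS = (5..30).freeze
MIN_COMPETING_PAGES = 2

# Sample ranking positions for one keyword across several pages.
positions = [3, 7, 18, 42]

# Keep only positions inside the vulnerable 5-30 window.
at_risk = positions.select { |pos| AT_RISK_POSITIONS.cover?(pos) }

at_risk                              # => [7, 18]
at_risk.size >= MIN_COMPETING_PAGES  # => true, enough pages to flag
```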

Instance Method Summary

Methods inherited from BaseService

#log_debug, #log_error, #log_info, #log_warning, #logger, #options, #tagged_logger

Constructor Details

#initialize(options = {}) ⇒ CannibalizationService

Returns a new instance of CannibalizationService.



# File 'app/services/seo/cannibalization_service.rb', line 22

def initialize(options = {})
  super
  @min_search_volume = options[:min_search_volume] || 50
end

Instance Method Details

#analyze_keyword(keyword) ⇒ Hash?

Analyze a specific keyword for cannibalization

Parameters:

  • keyword (String)

    The keyword to analyze

Returns:

  • (Hash, nil)

    Cannibalization details or nil if none



# File 'app/services/seo/cannibalization_service.rb', line 51

def analyze_keyword(keyword)
  pages = SeoPageKeyword.where(keyword: keyword)
                        .ranking
                        .includes(:site_map)

  return nil if pages.size < MIN_COMPETING_PAGES

  {
    keyword: keyword,
    search_volume: pages.maximum(:search_volume),
    pages: pages.map do |pk|
      {
        url: pk.site_map.url,
        category: pk.site_map.category,
        position: pk.position,
        traffic_share: pk.traffic_share
      }
    end.sort_by { |p| p[:position] || 999 }
  }
end
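The `sort_by { |p| p[:position] || 999 }` at the end deserves a note: unranked pages have a nil position, and the 999 fallback pushes them behind every ranked page. A sketch on hypothetical sample hashes standing in for `SeoPageKeyword` rows:

```ruby
# Hypothetical page hashes; nil means the page currently has no ranking.
pages = [
  { url: '/blog/heated-floors-guide', position: 12 },
  { url: '/products/heated-floors',   position: 4 },
  { url: '/faq/heated-floors',        position: nil }
]

# nil positions map to 999, so unranked pages sort last.
sorted = pages.sort_by { |p| p[:position] || 999 }

sorted.map { |p| p[:position] }  # => [4, 12, nil]
```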

#for_page(site_map) ⇒ Array<Hash>

Get cannibalization issues for a specific page

Parameters:

  • site_map (SiteMap)

    The page to analyze

Returns:

  • (Array<Hash>)

    List of keywords being cannibalized



# File 'app/services/seo/cannibalization_service.rb', line 75

def for_page(site_map)
  site_map.seo_page_keywords.at_risk.map do |pk|
    competitors = pk.competing_pages.at_risk.includes(:site_map)
    next if competitors.empty?

    {
      keyword: pk.keyword,
      this_position: pk.position,
      search_volume: pk.search_volume,
      competitors: competitors.map do |cpk|
        {
          url: cpk.site_map.url,
          category: cpk.site_map.category,
          position: cpk.position
        }
      end
    }
  end.compact
end
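The `map` / `next` / `compact` shape used here is worth calling out: `next` inside the block yields nil for keywords with no competitors, and `compact` drops those entries. A self-contained sketch with hypothetical data in place of the ActiveRecord associations:

```ruby
# Hypothetical at-risk keywords for one page; the second has no competitors.
keywords = [
  { keyword: 'heated floors',     competitors: ['/blog/heated-floors'] },
  { keyword: 'radiant heat cost', competitors: [] }
]

issues = keywords.map do |pk|
  next if pk[:competitors].empty?  # emits nil for this entry

  { keyword: pk[:keyword], competitor_count: pk[:competitors].size }
end.compact                        # strips the nils

issues  # => [{ keyword: 'heated floors', competitor_count: 1 }]
```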

#process ⇒ Array<Hash>

Find all cannibalization issues site-wide

Returns:

  • (Array<Hash>)

    List of cannibalization issues



# File 'app/services/seo/cannibalization_service.rb', line 29

def process
  @logger.info '[CannibalizationService] Analyzing keyword cannibalization...'

  issues = competing_keywords.filter_map do |entry|
    next if entry[:pages].size < MIN_COMPETING_PAGES

    {
      keyword: entry[:keyword],
      locale: entry[:locale],
      search_volume: entry[:pages].first[:search_volume],
      pages: entry[:pages].sort_by { |p| p[:position] || 999 }
    }
  end

  @logger.info "[CannibalizationService] Found #{issues.size} cannibalization issues"

  issues.sort_by { |i| -(i[:search_volume] || 0) }
end
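The aggregation can be sketched on hypothetical in-memory entries: `filter_map` skips keywords with too few competing pages, and the final sort orders issues by search volume, highest first, with nil volumes treated as zero:

```ruby
MIN_COMPETING_PAGES = 2

# Hypothetical pre-grouped keyword entries; URLs and volumes are made up.
entries = [
  { keyword: 'heated floors', search_volume: 900, pages: %w[/a /b] },
  { keyword: 'tile spacers',  search_volume: 50,  pages: %w[/c] },
  { keyword: 'radiant heat',  search_volume: nil, pages: %w[/d /e] }
]

issues = entries.filter_map do |entry|
  next if entry[:pages].size < MIN_COMPETING_PAGES  # 'tile spacers' dropped

  { keyword: entry[:keyword], search_volume: entry[:search_volume] }
end

# Descending by volume; nil volume counts as 0 and sorts last.
ordered = issues.sort_by { |i| -(i[:search_volume] || 0) }

ordered.map { |i| i[:keyword] }  # => ["heated floors", "radiant heat"]
```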

#recommendations(issues) ⇒ Array<Hash>

Get recommended actions for cannibalization issues

Parameters:

  • issues (Array<Hash>)

    Cannibalization issues from #process or #analyze_keyword

Returns:

  • (Array<Hash>)

    Recommendations



# File 'app/services/seo/cannibalization_service.rb', line 98

def recommendations(issues)
  issues.map do |issue|
    pages = issue[:pages]
    best_page = pages.min_by { |p| p[:position] || 999 }
    other_pages = pages - [best_page]

    {
      keyword: issue[:keyword],
      search_volume: issue[:search_volume],
      recommendation: build_recommendation(best_page, other_pages),
      best_page: best_page,
      consolidate_pages: other_pages
    }
  end
end
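The selection logic above reduces to one `min_by`: the best-ranking page (lowest position, nil treated as 999) is kept, and every other page becomes a consolidation candidate. A sketch on hypothetical page hashes:

```ruby
# Hypothetical competing pages for a single cannibalized keyword.
pages = [
  { url: '/blog/heated-floors',     position: 9 },
  { url: '/products/heated-floors', position: 4 },
  { url: '/faq/heated-floors',      position: nil }
]

best_page   = pages.min_by { |p| p[:position] || 999 }
consolidate = pages - [best_page]

best_page[:url]                  # => "/products/heated-floors"
consolidate.map { |p| p[:url] }  # => ["/blog/heated-floors", "/faq/heated-floors"]
```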